Oct 17, 2018
For this special blog, we wanted to introduce you to another key member of our team – StellarAlgo CTO Sean Fynn. Sean brings an impressive résumé in data solution architecture, having worked with organizations ranging from startups to multinational companies.
You have over 17 years of experience in the data space. Talk to us about how you’ve seen things change since you began your career.
When I first started in the 90s, I was just excited about moving toward data processing on client-server architecture and away from mainframes. As we moved into the 2000s, the technologies available to developers allowed us to do so much, and to do it more quickly. Some of the changes that shaped my experience in data solution architecture really started around the dot-com era, when my company offered SaaS (software-as-a-service) before that term even existed. The concept of a zero-footprint, data-intensive solution serving large amounts of data over the internet was something people were just starting to tackle. Google didn’t exist and Amazon only sold books. I think back on the effort, the work, and the sheer struggle of trying to deliver that content and process those amounts of data in the early 2000s – it’s something we just take for granted nowadays.
I lived through the shift away from the physical data center model – having to manage data centers ourselves, then migrating to database virtualization (which was pretty leading edge in the mid-2000s). Pretty much everything we tried to do wasn’t supported back then, but we were always pushing the envelope for both cost-effectiveness and user experience in delivering high-performance solutions.
Since 2010, I have been moving everything I do into cloud providers such as Amazon’s AWS and Microsoft’s Azure, across both IaaS and PaaS. I feel fortunate to understand the way it was, so I don’t take for granted what we have today. Young developers and data scientists who are just beginning their careers really have no idea what it was like. What we do day-to-day at StellarAlgo would have taken huge amounts of hardware, time, and expense to deliver probably a fraction of what we can deliver now. What that means for companies like ours is that we’re able to execute on concepts and ideas we could only dream about even as little as five years ago, and bring innovative products and services to market affordably.
You have data and systems experience in a multitude of industries – entertainment (movie distribution), retail, big-box, electrical utilities, consumer marketing organizations, and many SaaS startups. How do you find live audience businesses like sports, entertainment, and attractions to be different data-wise?
When you get down to the deep technical level, there are many similarities – which is a good thing, because we can bring together the experience of professionals from different industries and apply it to any market. The difference in the live audience space, however, is what we can do with the data in terms of building that crystal ball and modeling the behaviors behind how our clients’ customers make their buying decisions.
Time sensitivity is another difference. Our clients need to react very quickly to their data. We’re not just analyzing historical trends; we need to act on data that would normally take days or weeks to learn from. They need the answers now. The exciting challenge for us is to deliver those answers in a timely manner.
The emerging social media and secondary market industries are also an interesting challenge. Expectations in this space are changing rapidly: there are regional differences in data sources, and compliance issues keep popping up. All of this provides new opportunities and new data, pushing us toward innovations in data processing. It feels like a moving target, which adds to the excitement.
Why did you decide to join StellarAlgo?
I’ve been in my official role as CTO just since this past summer, but I have been involved with StellarAlgo in some capacity for about two years. It’s a new SaaS company with a new product – right in my wheelhouse, which is exciting. I love working on product and I love working in the SaaS space. The fact that this company was both, and that it was building a product that leveraged interesting data – to me, that was perfection. It’s super exciting. As I started to work with Vincent, I started to see the product – or the potential product at the time – and the opportunity in the market. I wanted to partner with this company and bring my experience of running a SaaS company for more than 13 years to StellarAlgo. It’s always great to see a product you’ve worked on that your customers want to use, and to see the difference it makes in what they do. And of course my passion is definitely data, which is the name of the game here at StellarAlgo. The opportunity to work on machine learning and model development, and to see how hungry the market is for that, was a really exciting side of StellarAlgo. It’s not just about delivering reports through an application; it’s about working on the predictive analytics that ultimately deliver major value for organizations. That was a big draw for me, especially with my background.
What kind of advice do you have for organizations that have less in-house technical knowledge to get more out of their data?
For organizations of any size, step one is to make a conscious effort to start making data-based decisions – leveraging information and treating data as an asset. Define what data assets you have. Implement internal procedures that can help increase the quality of your data. If you are missing data that you need or want, make it a priority to start capturing it. And you don’t necessarily need a team of five people to do that. Oftentimes business users are the best individuals to consult in that regard – the people who can say, ‘this data is valuable to the company, and you can make a difference by doing something with it.’ In my experience, people are willing to take the time to keep a data asset in good shape if they know it’s going to be used. Something that simple is so valuable when you want to act on the insights that come from that data. That’s the first place I would recommend starting.
One other thing to note: you also don’t need to turn data collection into an arduous task that burdens your sales and marketing teams. Keep it simple and make sure the data quality is there. It can be a beast, but it doesn’t need to be, and it doesn’t have to be expensive. There isn’t always a right or wrong way to start – it’s about what makes sense for your organization right now.
Looking into your crystal ball, where do you see major innovation coming in the years ahead?
I believe that in this industry, major innovation is going to be around price point – specifically the price point of the technology we use. We’re able to bring more advanced technologies, patterns, and responsiveness that, even though they exist now, are not yet cost-effective for many organizations. That kind of innovation makes a difference: for the same amount of spend on these kinds of solutions, you will be able to get more out of that spend in the future. That could lead to more real-time data processing and feedback, which is always more expensive than batch processing. We would be able to tighten that gap, starting with cloud providers, and in the future incorporate newer technologies that will help democratize that kind of processing power.

At StellarAlgo, we’re all about taking what we learned before and applying it within newer cloud services. There are things we’re using now that didn’t even exist 10 years ago. Everyone’s on a pretty steep trajectory of evolution – sometimes it feels like a revolution. Because we don’t have a crystal ball, we keep our architecture and options open so that, as a SaaS company, we can embrace better patterns as they come along to give our clients the best user experience at a great price point – from major league teams to minor league teams that may not have the same kind of budget. It’s something we strive for every day. The way we do things could be different in a year’s time. We assume that will be the case.
Thanks for your insights, Sean!