
Speaker Contest winner Max Simmonds on why children’s data isn’t safe

This year’s sTARTUp Day Speaker Contest brought together dozens of inspiring ideas, but one topic stood out above the rest. Max Simmonds, a UK-born engineer turned founder, won the public vote with a talk that exposes an overlooked risk inside millions of homes: the massive amount of data collected about children through everyday consumer devices. His message resonated with parents, technologists, and voters alike, earning him a place on the sTARTUp Day 2026 stage.

In this interview, Max shares his story, from his early fascination with engineering and his international career at ESA and CERN to the moment he realised how baby monitors quietly gather, store, and monetise sensitive data. He speaks candidly about what he discovered as a new parent, why children’s data is uniquely vulnerable, and how this realisation led him and his co-founder to build Purple Parrot, a privacy-first baby monitoring system designed to keep families safe.

Tell us a little about yourself. Where did you grow up and how did you end up in Estonia?

I grew up in a small town called Abingdon, just outside of Oxford in the United Kingdom. From a very young age, I was fascinated by building things. I spent most of my time in my dad’s workshop, fixing cars, motorbikes, and pretty much anything that was broken. I loved making things work and solving problems — that’s really where my passion for engineering began.

I went on to study electrical and electronic engineering at the University of Plymouth, which is known for its strong engineering programme. During my studies, I did two industry placements that shaped my professional direction: the first at National Instruments, the second at CERN in Geneva, Switzerland.

After graduating, I moved to the Netherlands for a year. I met my now-wife at university — she’s Estonian and was finishing her master’s degree while we lived there. I worked at the European Space Agency, doing research and development on new spacecraft technologies.

When our daughter was born two and a half years ago, we began thinking seriously about where we wanted to raise her. We decided that Estonia would be the best place — it’s safe, family-friendly, and has a great quality of life. So when she was about a year old, we made the move. Around the same time, I transitioned from electronics engineering to software development.

It was actually our experience as new parents in Estonia that sparked the idea for our startup. When we were searching for a baby monitor, we realised how much personal and health data these devices collect and share, often without users being aware. That discovery led to the creation of our company — an effort to build privacy-focused, intelligent baby monitoring technology that keeps families safe without compromising their data.


What are some of the key lessons you’ve learned working with world-class institutions like ESA, CERN and companies such as Lloyds Banking Group?


Over the years, I’ve worked in everything from small startups to large international organisations.

When I joined the European Space Agency, everything was large — the missions, the teams, the budgets, and especially the timelines. At ESA, projects could last years or even decades, and precision was everything. You spend far more time on research, validation, and documentation than you ever would at a startup. It taught me patience and the value of thoroughness.

One of the first startups I joined was Open Cosmos, which builds CubeSats — small satellites designed for rapid deployment in space. When I joined, there were only about 20–30 people in the company. It was exciting because I could literally touch the hardware that would later orbit the Earth. That kind of hands-on, fast-paced environment taught me the importance of adaptability and how to get things done efficiently with limited resources.

Then there was Lloyds Banking Group — a completely different world again. In finance, the main priority is risk management because you’re dealing with people’s money. Every change, every line of code, goes through rigorous review. It’s much slower, but for good reason. That experience gave me a deep respect for processes and security, especially in systems that can impact millions of users.


Your talk at sTARTUp Day 2026 focuses on how kids’ data is stolen and monetised. What first made you aware of this problem?

When my wife and I had our daughter, one of the first things we wanted to buy was a baby monitor. There are essentially two types: traditional ones that are not connected to the internet, and newer “smart” ones that are. Traditional monitors are pretty simple — they transmit audio and video directly between two devices and aren’t accessible from outside the home — but they don’t offer advanced features such as monitoring the baby’s breathing or alerting parents if the baby rolls over onto their stomach.

At first, we were tempted by those connected monitors because of the extra safety features. But when I started reading through their privacy policies — initially just out of curiosity — I was shocked by what I found. The amount of data these devices collect, store, and share is enormous. By simply using a Wi-Fi-connected baby monitor, parents agree to give the company access to images, audio, environmental information, and even health metrics. These monitors continuously stream HD video and sound to servers, often located in foreign countries, where the data is potentially stored indefinitely.

To put that in perspective, the data generated by one connected baby monitor in a single year would be enough to form a stack of printed paper reaching halfway to the International Space Station. And that’s for just one family. For millions of users worldwide, the scale becomes almost unimaginable.
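That comparison can be sanity-checked with rough arithmetic. The figures below are assumptions for illustration, not numbers from the interview: a continuous HD stream at about 2 Mbit/s, roughly 5 KB of data per printed page, and standard 0.1 mm paper.

```python
# Back-of-the-envelope check of the "stack of paper" comparison.
# All constants are illustrative assumptions, not measured values.

BITRATE_BPS = 2_000_000            # assumed HD video stream, 2 Mbit/s
SECONDS_PER_YEAR = 365 * 24 * 3600
BYTES_PER_PAGE = 5_000             # ~5 KB of data printed per page
SHEET_THICKNESS_M = 0.0001         # 0.1 mm per sheet of paper

bytes_per_year = BITRATE_BPS / 8 * SECONDS_PER_YEAR
pages = bytes_per_year / BYTES_PER_PAGE
stack_height_km = pages * SHEET_THICKNESS_M / 1000

print(f"{bytes_per_year / 1e12:.1f} TB per year")   # ~7.9 TB
print(f"stack height: {stack_height_km:.0f} km")    # ~158 km; ISS orbits at ~400 km
```

Under these assumptions, one monitor produces on the order of 8 TB a year, a stack roughly 150–200 km tall, which is indeed about halfway to the ISS's ~400 km orbit.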

A baby monitor continuously captures everything within earshot — your conversations, the sounds of your home, your child’s movements. It’s never really off. Over time, this creates a detailed digital identity for a child before they even learn to talk. Research by the New Zealand government found that by the age of 13, the average child already has around 72 million data points connected to them — all of which can uniquely identify them.

It’s easy to underestimate the future consequences of such data collection. When Facebook first appeared, nobody imagined it would later be used to influence elections. Similarly, we don’t yet know how today’s massive datasets about children might be used in ten or twenty years.

There have been countless cases of baby monitors being hacked. One particularly disturbing incident in 2019 involved a hacker who took control of a connected baby monitor, played horror movie music through it, and spoke to an eight-year-old girl pretending to be Santa Claus. Understandably, she was terrified.

So, in short, I became aware of this issue because I was a new parent trying to keep my child safe, and I discovered that the very products designed to protect children were, in fact, putting their data at risk. Once I realised how deep and unregulated that problem is, it was impossible to ignore.


You’ve mentioned that by age 13, a child can have up to 72 million data points collected about them. Could you give us a sense of what kind of data this includes — and who’s buying it?

It’s mostly marketers and insurance companies who buy this kind of data, but in reality, almost anyone can. The way the system works is that the data collected from baby monitors or similar devices is aggregated and then sold to third-party brokers. Those brokers, in turn, resell it to whoever is willing to pay — which means even I, in theory, could buy it if I wanted to.

In Europe, we’re somewhat better protected under GDPR, but even that framework has loopholes. A common practice is to make the data “non-identifiable,” which typically just means removing the person’s name. The images or audio recordings might still exist, but because they’re no longer directly tied to an identity, companies can legally claim the data isn’t personal. Some baby monitor brands even state in their privacy policies that they don’t sell “facial data points,” which sounds reassuring — until you realise it might simply mean they crop out the face or blur it, which can be undone in some cases, while selling everything else.



Tell us about your solution — the “Privacy-first, offline, intelligent baby monitor.” How does it work, and what makes it different from existing products?

Our product is essentially a hybrid between traditional and smart baby monitors. Traditional monitors — the ones not connected to the internet — are great for privacy, but they’re quite basic. They let you see and hear your baby, and that’s it. For example, the monitor we currently use at home turns on when it detects noise above a certain threshold and streams the video to a handheld receiver. It’s simple and private, but it lacks intelligent features.

What we’re building combines the best of both worlds. Our monitor uses Edge AI, which allows video processing and AI algorithms to run directly on the device rather than in the cloud. In other words, the analysis happens inside your home — the data never leaves it. Thanks to advances in computing, what used to require a large, expensive server can now be done with compact in-home hardware.

This means parents can enjoy all the benefits of smart monitoring — real-time insights, movement detection, and health indicators — without sacrificing their family’s privacy.



What challenges have you faced so far — whether technical, regulatory, or market-related — in building a product that prioritises privacy over convenience?


We’re facing several kinds of challenges at the moment, and most of them are technical rather than regulatory. The algorithms we use for video analysis and monitoring are typically designed to run on large, powerful computers, but our goal is to make them work efficiently on small, low-power hardware that can sit safely in a child’s room. That means redesigning and optimising everything to run locally — trimming the models, rewriting the logic, and finding clever ways to achieve the same accuracy without relying on cloud computing.

Another challenge is keeping the system agnostic to individual children. Normally, AI models are trained on massive datasets, but it would go completely against our values to use thousands of images of real children to train the algorithm. So we’ve had to find alternative ways to make the technology work without using personal data. It’s a difficult problem, but one we’ve managed to solve fairly well so far.

Hardware is another area that’s tricky. We’d like to source as much as possible from European manufacturers, ideally even European-made chips. Unfortunately, most of the companies producing the kind of Edge AI hardware we need are based in China, which dominates this field. We’re still exploring how to stay true to our European supply-chain vision, but if the right components simply aren’t available here, we may have to compromise.

On the regulatory side, things are actually much simpler. Because all the data processing happens locally and nothing ever leaves the home, our product is inherently GDPR-compliant. The recordings belong to the parents — not to us or to any company.


At this sTARTUp Day we discuss the good life. What's a good life for you?

For me, a good life starts with having time for my family. Being able to spend time with the people you love — especially your children — is absolutely essential. Without that, life would feel much less meaningful. But it’s also about having something purposeful to work toward. If you spend all day doing nothing, it’s hard to feel fulfilled. I think a good life comes from combining both — meaningful work and meaningful relationships.

In my case, that sense of purpose comes from knowing that what I’m building could genuinely make the world safer for children. Data is becoming one of the most valuable commodities of the future, and we don’t yet know all the consequences of how it’s collected and used. So protecting children — both physically and digitally — feels like a mission worth dedicating myself to.

_____________

Maija Simmonds, Purple Parrot co-founder and Max’s wife


What do you see as the bigger vision for Purple Parrot? Where do you want to end up with the company?


Getting Purple Parrot medically approved is a major milestone for us. From the very beginning, our vision has been to bring this technology into neonatal care — into NICUs — to help protect the most fragile lives. We’ve always said that if our work could save even one baby, it would all be worth it. As the company grows, we also see opportunities to make an impact beyond infant care. One direction we’re exploring is developing similar monitoring systems for the elderly and their families. It’s a long journey ahead, but it’s one that feels truly meaningful.

What is a good life for you?

To me, a good life is when you feel safe in your world and free to make your own choices — that quiet sense of strength that comes from knowing you’re in control of your life, including your data. It’s also about having a strong community around you and a sense of purpose that keeps you moving forward. I have many goals and ambitions, but purpose runs deeper than that. For me, it means giving back — creating something that genuinely helps others and leaves the world a little better than it was yesterday. Without that, I couldn’t truly say I’m living a good life. Purple Parrot was born out of that very desire — to turn this purpose into something real and lasting.

__________

sTARTUp Day is turning 10 – now is the perfect time to grab your ticket at the best price! Join us in Tartu on January 28–30, 2026, for an unforgettable anniversary edition filled with inspiring speakers, valuable connections, and new opportunities. Get your ticket today and be part of the biggest entrepreneurship festival!

