Active Inference & The Spatial Web

Web 3.0   |   Intelligent Agents   |   XR Smart Technology

Ten Years Later: Katryna Dow Reflects on Data Privacy & Decentralized Identity

Spatial Web AI Podcast

Watch on YouTube

Listen on Spotify

Ten years ago, on March 31, 2015, I hosted Katryna Dow, CEO and Founder of Meeco, on my former podcast Collaborative IQ to discuss an emerging concern: personal data privacy.

 

Katryna is one of the earliest pioneers tackling data privacy and personal digital identity, and her company, Meeco, “helps organizations build compliant, future-proof solutions that give their customers the power to access, control and exchange data on their own terms.”

 

That episode ten years ago was titled “Let’s Talk: Retaining Ownership of Your Personal Data,” and in it we addressed themes that, at the time, felt forward-looking yet somewhat abstract to many listeners. A decade later, Katryna and I reunited on my current podcast, Spatial Web AI, to revisit these topics, examining how data privacy, digital identity, and decentralized architectures have dramatically evolved, often validating Katryna’s early concerns.

Early Warnings and the Cambridge Analytica Moment

In our original conversation, Katryna expressed deep unease about the unchecked collection of personal data and the potential for misuse—a sentiment powerfully validated by the Cambridge Analytica scandal just a few years later in 2018. This event became a watershed moment, demonstrating the tangible risks Katryna had warned about years prior: mass data harvesting, psychological profiling, and targeted influence campaigns.

Original interview, which aired on March 31, 2015, on the Collaborative IQ podcast

From Centralization to Decentralization

A significant shift between the initial interview and today, in 2025, has been the move from centralized databases—the so-called “honeypots” for hackers—to decentralized architectures. Katryna described this transformation vividly, emphasizing Meeco’s dedication to decentralized technology since its inception. She highlighted how decentralization reduces the risk of large-scale data breaches by returning control of data to the individuals it belongs to. An important advancement in this area is Meeco’s collaboration with DNP in Japan, which leverages decentralized identifier (DID) solutions to provide secure, privacy-oriented data exchange.
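To make the architectural point concrete, here is a minimal sketch, in TypeScript with Node’s built-in crypto module, of the per-user, per-record pattern Katryna describes later in the transcript: every user holds their own key and every piece of data gets its own identity, so compromising one record, or even one user’s key, never opens the whole table. The names and structures below are hypothetical illustrations for this article, not Meeco’s actual implementation.

```typescript
// Illustrative only: per-user keys and per-record encryption — the general
// pattern behind "no honeypot" architectures. Not a production design.
import { randomBytes, randomUUID, createCipheriv, createDecipheriv } from "node:crypto";

interface EncryptedRecord {
  id: string;        // unique identity assigned to this piece of data
  ownerDid: string;  // decentralized identifier of the data subject
  iv: Buffer;
  authTag: Buffer;
  ciphertext: Buffer;
}

// Each user holds their own key; there is no master key covering everyone.
function newUserKey(): Buffer {
  return randomBytes(32); // AES-256 key, held by (or wrapped for) the user
}

// Encrypt one record under one user's key. A breach of one record, or one
// user's key, does not expose anyone else's data.
function encryptRecord(ownerDid: string, userKey: Buffer, plaintext: string): EncryptedRecord {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", userKey, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { id: randomUUID(), ownerDid, iv, authTag: cipher.getAuthTag(), ciphertext };
}

function decryptRecord(userKey: Buffer, record: EncryptedRecord): string {
  const decipher = createDecipheriv("aes-256-gcm", userKey, record.iv);
  decipher.setAuthTag(record.authTag);
  return Buffer.concat([decipher.update(record.ciphertext), decipher.final()]).toString("utf8");
}

// Usage sketch (hypothetical DID)
const aliceKey = newUserKey();
const rec = encryptRecord("did:example:alice", aliceKey, "blood type: O+");
console.log(decryptRecord(aliceKey, rec)); // "blood type: O+"
```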

The Evolution of Regulatory Frameworks

The regulatory landscape has changed drastically over the past decade. During the initial 2015 discussion, privacy regulations were fragmented, limited, or absent altogether. Fast-forward to today, and frameworks such as the GDPR in Europe, Australia’s Consumer Data Right, and Japan’s stringent privacy laws represent profound progress. Katryna explained how these regulations, which initially appeared ambitious or overly stringent, now serve as essential protections for digital identities and consumer rights.

From Predictive Analytics to Intelligent Agents

One intriguing conversation thread from 2015 was the concept of predictive analytics inspired by the film Minority Report. Today, Katryna and I noted that reality has, in many ways, surpassed fiction. Predictive analytics has evolved into sophisticated AI-driven agents that can not only predict behaviors but also actively assist in everyday decision-making. While discussing the benefits of personalized AI, Katryna emphasized the ongoing ethical and practical challenges of creating trusted systems that genuinely serve individual users, highlighting the critical importance of data consent and control.

Consent and Trust: The Heart of Privacy

The notion of consent has evolved significantly over the last decade. Katryna observed that while younger generations are increasingly comfortable with what I referred to as “living out loud” digitally, awareness is growing around the necessity of clearly defined consent and context-specific data sharing. She cited Apple’s incremental steps toward privacy by design, such as giving users granular control over location sharing, as examples of progress toward more nuanced consent models.
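As a rough sketch of what “nuanced consent” can look like in data terms, the TypeScript snippet below models a time-bound, purpose-scoped consent grant of the kind Katryna describes in the transcript (access for an hour, a week, or the duration of a trip). The field names and the example DID strings are hypothetical, chosen only to illustrate the idea.

```typescript
// A minimal sketch of a time-bound, purpose-scoped consent record.
// All identifiers and fields are illustrative, not a real product schema.
interface ConsentGrant {
  grantedTo: string;   // e.g. "did:example:airline"
  dataScope: string[]; // e.g. ["passportNumber", "frequentFlyerId"]
  purpose: string;     // e.g. "check-in for booking ABC123"
  expiresAt: Date;     // consent lapses automatically
}

// Sharing is allowed only for the named requester, the named field,
// and only until the grant expires.
function isSharingAllowed(grant: ConsentGrant, requester: string, field: string, now = new Date()): boolean {
  return (
    grant.grantedTo === requester &&
    grant.dataScope.includes(field) &&
    now < grant.expiresAt
  );
}

// Example: share a passport number with the airline only for the duration of a trip.
const grant: ConsentGrant = {
  grantedTo: "did:example:airline",
  dataScope: ["passportNumber"],
  purpose: "check-in for booking ABC123",
  expiresAt: new Date("2025-06-30T23:59:59Z"),
};
console.log(isSharingAllowed(grant, "did:example:airline", "passportNumber")); // true until expiry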

 

Looking forward, Katryna suggested that the next decade will be defined by one central theme: trust. Trust, she argued, is foundational to the future of digital interactions—encompassing cryptographic verification, transparency of data origins, and ethical AI applications. Establishing robust digital trust mechanisms will be key to ensuring technology genuinely serves humanity rather than exploits vulnerabilities.
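To ground the idea of cryptographic verification and transparency of data origin, here is a toy TypeScript example using Node’s built-in crypto module: an issuer signs a minimal claim (the “over 18: yes” style attestation Katryna mentions in the transcript) and a relying party checks the signature without ever seeing a date of birth. Real verifiable credentials, as standardized by the W3C, carry far more structure than this; the sketch only shows the underlying trust primitive, and the identifiers are made up.

```typescript
// Toy illustration of verifying data origin and integrity with a signature.
// Not the full W3C Verifiable Credentials data model.
import { generateKeyPairSync, sign, verify } from "node:crypto";

// The issuer (e.g. a government identity provider) holds the signing key.
const issuer = generateKeyPairSync("ed25519");

// A minimal, disclosure-limited claim: no date of birth is included.
const claim = JSON.stringify({ subject: "did:example:denise", over18: true, issued: "2025-03-31" });
const signature = sign(null, Buffer.from(claim), issuer.privateKey);

// The relying party (a nightclub, a car-rental desk, etc.) verifies that the
// claim really came from the issuer and has not been tampered with.
const trusted = verify(null, Buffer.from(claim), issuer.publicKey, signature);
console.log(trusted ? "claim verified: over 18" : "reject: cannot verify origin");
```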

The Road Ahead: Challenges and Opportunities

Despite remarkable progress, the road ahead is filled with complex challenges, including geopolitical differences in privacy standards, rapid technological advances outpacing regulations, and the increasing sophistication of cyber threats. However, Katryna remains optimistic, driven by collaborative global efforts toward standardized decentralized systems and enhanced digital trust frameworks.

 

Reflecting on ten years of progress and pitfalls, my conversation with Katryna Dow underscores both the urgency and the excitement of navigating a rapidly evolving digital landscape. From early warnings to tangible progress, our dialogue is a compelling reminder of how quickly technology can evolve—and how crucial it is for individuals, businesses, and governments to stay vigilant and proactive.

See You in 2035!

Now the question remains: what technological shifts and societal impacts will the next decade reveal? Given the accelerating pace of AI and computing innovation, 2035 promises a technological landscape vastly different from today’s. Katryna and I plan to reconvene in ten years to assess just how transformative the next decade has been and to discuss the wild and crazy future yet to come.

 

To hear the full conversation and explore these critical themes in greater depth, tune into this special episode of the Spatial Web AI Podcast featuring my ten-year anniversary conversation with Katryna Dow. (Bonus: at the end of this episode, I have attached the audio of our previous interview, which originally aired on March 31, 2015.)

Huge thank you to Katryna Dow for being on our show!

Connect with Katryna Dow:

 

LinkedIn: https://www.linkedin.com/in/katrynadow/

 

To learn more about data privacy infrastructure, visit Meeco at https://www.meeco.me/

 

A dedicated space for learning, community, and collaboration around Active Inference AI, HSTP, HSML, and the convergence of technologies built on these new tools – Digital Twins, IoT, Smart Cities, Smart Technologies, and more.

The FREE global education hub where our community thrives!

 

Scale your learning experience beyond content and cut out the noise in our hyper-focused, engaging environment, where you can innovate with others around the world.

 

Become a paid member, and join us every month for Learning Lab LIVE!

Episode Transcript:

Speaker 1 – 00:00 Foreign. Speaker 2 – 00:13 Hi, everyone. Welcome to another episode of the Spatial Web AI podcast. Today we’re doing something really fun. Ten years ago, so this month, March 2015, I had the pleasure of interviewing our guest right now on my old podcast, which was the Collaborative IQ podcast. My guest today is Katrina Dow, and she is the founder and CEO of a company called Meco. And she has been on a mission for longer than a decade. Back, back when Minority Report movie came out, and we’re talking about the early 2000s, like 2002 or three when that came out. And it. In our conversation 10 years ago, she was describing how that kept her up because it just this idea that your data can be used in such a powerful way to. To kind of control what aspects of your life are. Are being used for various purposes. Speaker 2 – 01:19 So, you know, she has long believed that your data belongs to you and that you should control how it’s being used and who can access it. And so today we are coming back 10 years later, and so much has changed over this last decade, but her mission has stayed steady. And so I’m here today to talk to her about, you know, what she has been working on over the last 10 years, how this technology has progressed and what the future tech. Because I’ll tell you, Katrina has had this vision of the future long before the framework was there. But we’re at a point in time where the framework is here. And so I am just so excited to welcome you to our show today. Katrina, thank you for being here and welcome. Speaker 3 – 02:11 Thank you, Denise. It’s such a pleasure. And wow, 10 years. I mean, I feel like I just blink. And it was yesterday. In preparation for our discussion today, I listened to our interview from 10 years ago, and I joined you. I was joining you from Sydney early in the morning. The sun was coming up, and I can remember exactly where I was sitting and our conversation, kind of like it was yesterday. Today it’s the evening I’m speaking to you from Brussels in Belgium, and I blink. And 10 years have gone by and so much has changed and nothing has changed. So many of the challenges and the things that were talking about in terms of the weaponization or the lack of control or transparency around our data and our identity, those things have actually compounded. Speaker 3 – 03:11 But just as you’ve said in your introduction, what’s so different now compared to 10 years ago is that some of the key building blocks and the framework, the societal awareness, the evolution of technology is such that what sounded a little bit crazy 10 years ago is really entering the mainstream today. So I’m really looking forward to our conversation and being able to validate some of the things that have come to pass and ruminate on some of the things that are still off into the future, but I believe will unfold. Speaker 2 – 03:51 Yeah, And I mentioned Cambridge Analytica, the scandal earlier, because in our conversation 10 years ago, you were talking about the seriousness of this data collection and the potential ways that it could be used, as you said, to WEP as a, you know, weaponized against you. And. And then literally three years later, the scandal breaks. And it had started in 2016 with them using this, although they had been gathering the data since 2011. And so that’s just mind blowing because you were spot on with your concerns and everything else, although the public didn’t really start to get a sense of the power in that for yet another three years. 
Speaker 3 – 04:37 Again, I was prompted by listening to our conversation and, you know, were talking about the early days of what was the Facebook before it became Facebook. And I mean, Facebook was trying to do everything in those early days to get a combination of product, market fit on board, move from communities. As we know, it started on campus, and then it started to radiate out with friends and connections from campus, and then unleashed on the world. And some of those early algorithms or that massive data collection, data scraping was exactly what Cambridge Analytica developed, happened to kind of stumble upon. And it was everything were talking about because people were, you know, liking cat pictures and talking psychological profile games, you know. Exactly. Answering quizzes, sending each other on Facebook, hey, I know I’m this or that. Speaker 3 – 05:44 And, you know, it was like, you know, if. If were writing a sequel to the. To the book 1984, it was kind of like the world was in. Enrolled in this giant experimentation to fall into this combination of data collection, profiling. And then what we later learned was kind of how that data could be weaponized or used or applied to influence the way you think or vote or communicate without our everyday awareness of kind of what was happening. So were talking about the potential of that 10 years ago. A few years later, we actually. It’s, you know, it was already happening. It’s unleashed into the world. And, you know, they’re some of the things that we now, our technologists today. We’ve inherited many of those challenges in terms of how we can restore or build trust and transparency into our digital life. Speaker 3 – 06:57 Yeah, I mean, they’re really the challenges that we have today. Speaker 2 – 07:00 Yeah, absolutely. And I think the thing that people don’t really understand is that, you know, you’re not really, you’re not just painting a picture of your own data, you’re painting a picture of all of your relationships and connections to other people and then their data too. And I think that was the thing that was so surprising about the Cambridge Analytica thing is that Facebook, through their open graph, these people could build these apps within, which is what they did. And it not only profiled you, but profiled all of your friends. And then it had this community profile which gave a real psychological picture of everybody and who they’re connected with. And there’s so much power in that. Speaker 2 – 07:43 And I, I actually just something, I don’t know, I think it was like around last fall and it was that the meta glasses, Harvard students were able to just dox people by the glasses, take a picture and then their app was set to just scour the Internet to find other pictures and then find all the other details about them. And they could literally be walking up to someone and then start having a conversation as if they knew that person because they’re able to supply all these details, you know, that only someone familiar with them would know. And that’s an interesting aspect of where we are even now compared to then Minority Report like. Speaker 3 – 08:26 Yeah, and as you mentioned in the introduction, that was really what inspired me to start the company. You know, inspired me even 10 years before I started the company. 
But I, I think, you know, one of the challenges that we have right now, and particularly in the current sort of political landscape that we find ourselves in, sort of some of the geopolitical things that are happening is that technology, and particularly now with AI, it’s moving so much faster than either our ability as a society to understand the implications or to regulate or work out what the principles are or the ethical guidelines. And then, and then throw into the mix that we’ve sort of got these three geo powers emerging that are not necessarily aligned. Speaker 3 – 09:22 So we have in America a move really towards moving as fast as possible with AI and being able to commercialize and lead. We have China where really it’s the state that is really focused on the control. And then here in the eu, what we’re trying to do is to put the citizen at the heart of everything that happens with respect to their technology. And so kind of a rights based approach. And so there is this race on to build this digital infrastructure around the world with a very American kind of mercantile, a very state sponsored and then a citizen centric. And those things don’t necessarily blend well Together, they’re very different approaches. Speaker 2 – 10:11 So given today’s reality, the way you just described it, how does it compare to your expectations from 10 years ago of what you would have expected us to be in this moment 10 years later? Speaker 3 – 10:26 I think that the thing that was, I guess, encouraging for me to listen to our conversation again is that our North Star, the ability to enable everyone on the planet to get equity and value for the data they share, to be able to have the tools to access, control and exchange their identity, their data, that North Star hasn’t changed as a company. We had some big pivots. So I’d be the first to say were massively early 10 years ago to launch a BDC. Speaker 2 – 11:00 Yes, you were. Speaker 3 – 11:02 I think it’s like madness. Yeah. So were reading the landscape and were reading kind of the evolution of where things were moving. We were just way too early in terms of B2C. And so the first pivot, I would say for us a couple of years after our conversation, was to start looking at where are these core trusted relationships from a digital perspective? And it wasn’t hard to see that those things are really across financial services, health, your employer, your, maybe your airline, your favorite brands in terms of loyalty. Speaker 3 – 11:43 And so I moved, opened an office for us in London and spent a couple of years in every sort of enterprise incubator that we could connect into to actually understand what that problem was like inside the organization, around many of the things we discussed, loyalty, data access that really surfaced, the issues around cybersecurity, data rights, privacy, but also the really early days of the standards of community, which 10 years later, those things are maturing such that things were talking about 10 years ago are now about to be enshrined in EU regulation. So every European citizen resident by 2027 will have the right to what they’re calling a European Digital Identity wallet, built on the same frameworks and open standards that I’m very happy to say are incorporated in our technology, in our platform, similar to some of the things that are happening in America. Speaker 3 – 12:54 So one of those standards is the ISO standard around mdocs for the mobile driver’s license, which has been adopted by California. 
We’ve also got Apple and Google in many of the working groups and contributing to that. So what we’re starting to see is some of these technologies align and join up, but we’ve also got a population of more than 300 million people who are going to have access to those tools. So some things seemed kind of way off and crazy. And I and I realize now could not have happened without the evolution of frameworks, standards, common protocols, interoperability, and solutions that are scalable. Speaker 1 – 13:40 So. Speaker 3 – 13:41 So that has really helped what we’re doing unfold. Still some of the challenges, and now with the rise of AI, some of those things that were discussing as challenges a decade ago are kind of just like they’re on speed now. So it’s. How do we use this evolution of this new infrastructure to actually either design the digital world that we want or combat some of the issues which include massive rise in fraud, cybersecurity, and again, everything from identity theft to how easy it is to impersonate somebody online and, you know, all of the impacts that can have on somebody’s physical life as a result of being able to, as you say, scrape the digital life and kind of act like you just know that person. Speaker 2 – 14:42 Yeah, yeah. It’s crazy. I mean, the amount of fraud that attempts that happen daily. I mean, I get texts every single day that are, you know, and it’s crazy because I know my phone number’s just out there in the dark web, like, you know, well, and. And this last year, too, a few of the data houses, right, that. That literally house all the data, like huge data compilations on every person. And it’s a lot of really sensitive information, but every one of them in America, like three different really large data houses were hacked last year. And so it’s like at this point, then everybody’s data is out on the dark web. And it’s not just like your phone number or your address or whatever. Speaker 2 – 15:28 It is literally histories of relationships and all of this data that has been scraped and compiled to assemble these compositions of who you are. Speaker 3 – 15:42 Yeah, I think that’s possibly a good segue into what the challenge was. And were only sort of hinting at this a decade ago. We talked a little bit about personal clouds, but it’s the architecture. I mean, the reason that those large data fiduciaries can be hacked is legacy architecture. Yeah, Databases. Pretty simple. Once you’re in, you know, you have access to a table of records and a honeypot, basically. And what we have seen over the last decade, and, you know, there’s still resistance, but we’re seeing more and more organizations recognizing the value of decentralized architecture. Yeah. Or when it comes to personal data. And very proud to say that, you know, this has been our architecture from day one. Speaker 3 – 16:37 Being able to encrypt and assign keys to an individual user on an individual basis and then further assign a unique identity to each piece of data. And so if somebody is careless and shares their password or shares their keys or is not aware of the value of kind of securing their data, you know, that’s awful. But it’s not everybody that is able to be accessed and all their data to go at once. And so I think this is one of the really important shifts that the decentralized world is enabling is this ability to put better protections in from a security perspective and also to limit the harm to everyone of kind of a single entry or a back door. 
That, that basically means, you know, you get one record, you get them all. Speaker 2 – 17:44 Yeah. So how would you characterize kind of this evolution of privacy then? Because I know we’re, I want to talk about the decentralized aspect and the importance of decentralization, but over the last 10 years, do you feel like privacy has become an issue that people are more concerned with? Or do you think that people have gotten so used to not having any control over data that they just, they don’t know that there’s an alternative? Like what are you seeing with your company as far as challenges there, especially when you’re dealing with, like you had said, you know, banks and other institutions that you’re working directly with. Speaker 3 – 18:24 So I think again I’m speaking, my focus right now and I’m talking to you from Europe is really on the geographies that we’re working on where privacy is a really important part of the value proposition. So we still have a team back in Australia where our headquarters are, and since we last spoke, Australia has moved forward what they call the consumer data. Right. So the Australian government has said, look, consumers, citizens have the right to understand who has their data, how it’s being used, when it’s being used. I think there’s still an evolution to happen in Australia with the right tools to enable that. And I’m really proud to say that we’re involved in a number of projects specifically around cross border collaboration that we can talk about towards that. Speaker 3 – 19:21 But that’s one example of where it’s a combination of privacy regulation and consumer rights here in Europe. Over the last term of the European Parliament, which ended last year, is we saw significant wave of regulation post GDPR. So GDPR only came in after our initial conversation 10 years ago. But since then we’ve seen the Digital Markets Act, AI act, the Data Governance Act. So we’ve seen quite a lot of complementary regulations that are designed to put further protections into a European’s digital life allow them to exercise more of their rights. And at the heart of all that regulation is always privacy. And then the other place in the world that I’d be delighted for us to talk about is Japan that also has really strict privacy regulations. Speaker 3 – 20:17 And we’re doing some really wonderful and interesting work with a new partner in that part of the world. So I’m a little biased because each of those regimes recognize either citizens, consumers, customers, patients, students, employees, all have some type of right to privacy or right to actually understand how their data is being collected and used. I know in other parts of the world or in the United States, you have fragmented regulation. So you have some states like California that came out, you know, strong and front and center to say, you know, Californians have additional rights around, you know, how their data is accessed and used. Speaker 3 – 21:03 But I think what’s very difficult, particularly if you’re developing a product, is really difficult if you don’t have clarity of that overall regulatory regime or what the rules and regulations are that can be incorporated into a technology solution. You know, if you’ve got a fragmented market, it’s really hard to design something if by state. By state, the rules are completely different. Speaker 2 – 21:28 Yeah, 100%. 
And I think it’s interesting because you see regulatory bodies like, you know, the GDPR and the different rules that have come into play and with the way frameworks are structured right now with the World Wide Web and the centralized authority over data transactions. I remember. So 10 years ago, you were talking about, back in 2011, you had built a manifesto around this, and in that, it was literally describing how each person should be able to form their own rules over how their data can be accessed and used. And I think now we have these rules over. You can’t. You can’t scrape people’s data and use it without their permission. You can’t do these things. But that doesn’t give the power to the user to be able to control it being used for good or for things that they want. Speaker 2 – 22:26 We talked a little bit a couple of weeks ago about the spatial web protocol. So I know we’ll talk more about it today. But those frameworks, that’s coming, but we’re still kind of in this interim where now there’s rules and guidelines about not abusing someone’s data, but there’s still not the facilitation of the power. And I know you have been working on that for the past decade. Speaker 3 – 22:49 Yeah. And I think the big difference now is that some of those building blocks. So as you said, there are emerging protocols and Frameworks because all of us, you know, will have a digital twin of our physical self, but also a digital twin of anything physically we own and, or any entity or role that we have in our life. You know, there’ll be a digital equivalent. I know we want to explore that idea, but what’s different now is that we’ve had this recognition that centralized database honeypots are a target for hacking and fraud. And so this legacy architecture is not the way to manage large amounts of personal data. We have the rise, emergence, growth and maturing of decentralized architecture capabilities. Blockchain, a distributed ledger. And so we’ve recognized that there are ways to start designing systems that are more secure. Speaker 3 – 24:04 We have the emergence of common ways of organizing data about me, about you, and being able to exchange that and sign it cryptographically and issue it from a trusted source. So we now have verified credentials where a lot of that work has come out of the W3C, the decentralized identity foundation, the Open Identity Foundation. Speaker 3 – 24:28 So really amazing work over this last decade for global collaboration to come up with a range of standards so that when I share data about me with you, I can do that via a trusted source that might be my government in relation to my identity, or it could be from my health care practitioner, or it could be my employer, have it cryptographically signed so that when you receive that and you verify that, not only do you know that you can trust the source of that data, but you can be confident that data hasn’t been tampered with in some way. So we didn’t have those things 10 years ago. Speaker 3 – 25:09 We had the ideas around the value of that exchange, but we didn’t have the common ways to ensure that an application we at Mika were building could be interoperable with other technology platforms, which is critical. The interoperability piece is really critical for achieving scale and interoperability across different regimes. 
So whether or not that’s across border or across up from financial services to health to employment, to travel, without those common frameworks, standards approaches, it becomes almost possible to scale out a real solution. Speaker 2 – 25:59 Yeah, right. Well, okay, so a couple of things there. One of them, it’s funny because you and I, we caught up a couple of weeks ago to kind of catch up a little bit and talk about what we would do for this call. So in that conversation were talking about contrast between frameworks today versus where were 10 plus years ago. And one of the things that you said in this conversation a couple of weeks ago, were talking about The Spatial web protocol. And were talking about the fact that this framework, the technology is allowing for not only digital twinning in space, but also over time, and the power that, that gives for being able to set, you know, expirations on data and permissioning and everything else. Speaker 2 – 26:50 And you told me in this conversation that was part of your plan way back when, but it made you sound crazy. Speaker 3 – 26:56 I know. We actually coded that capability in. So, as I said, at the core of how we’ve designed our APIs, our backend applications, is always on these three core principles. The ability to access data, the ability to be able to control it, and the ability to be able to exchange it. And I know that there’s been a lot of work and a lot of debate around, oh, you know, we should be able to control your data and kind of lock it up in a vault and, you know, throw away the key. But the reality is that whether or not it’s booking an airline ticket or working remotely ordering pods for your coffee machine, everything we do is digital. So this idea that data is going to be exchanged in some kind of context was critical. Speaker 3 – 27:48 So one of the things that we decided on really early days, two foundation principles. Privacy by design, security by design. Privacy by design basically means the standard setting is off and you decide to turn it on. So the idea of consent or being informed about where the data is going and why it is going is core to everything we do. Security by design, again, it’s okay. How do we make sure that we’re always minimizing any kind of attack vector. And part of that is making sure that everybody has individual keys and we’re not building this kind of centralized, monolithic database of data, a layer below that. We always thought, okay, when you share or exchange data, duration is really important. Being able to set some rules around who has access. How long do they have access to that data for? Speaker 3 – 28:52 Is it for an hour, a week, a month? Is it while ever I’m a customer or is it the duration of this trip? And I think what we’ve seen is that Idea seemed crazy 10 years ago, but we’re even seeing now. I mean, Apple has really helped raise the awareness by you being able to decide whether or not you want to share your location perpetually or just while you’re using the app. Yeah. So I think there are many things now that are starting to help people understand that sharing can be really contextual and time bound. So I think, and I hope this idea of time bound or context will be a critical way for how decentralized, particularly decentralized identity and data infrastructure solutions develop. 
Speaker 2 – 29:46 Now, I know that, I think you had mentioned it was back in 2018, I believe you and your company wrote a paper that was centered around zero knowledge proofs and this zero trust architecture. And so I’m really curious how that plays into what you were just describing. And with the security that you see for protecting people in that way, that’s. Speaker 3 – 30:10 Probably a great segue into what we’re doing in Japan as well. So in 2018, I guess that was. So after spending a couple of years looking at what all the problems were sort of within enterprise, we wrote an extensive white paper which is called zero knowledge proofs of the modern digital world, Access control, exchange of data. So were looking at what does a modern digital life look like, what is that like for a family? And then we zeroed in on a really high value use case which is commonly known as KYC or know your customer or customer onboarding. And then we went further to say, you know, what would it look like if you could build a network of trusted actors? So it might include financial institution, it might include a telco, it might include government. Speaker 3 – 31:06 But instead of you and I being passive in that network through what we call in the paper the API of me or this idea of this identity wallet, that we could actually be co equal actor in that network. And so what were talking about, again, it was still a few years before we had clarity around verified credentials and the formats. But what we talked about in that paper was essentially what is commonly known now as a decentralized identity credentials and the ability for either a zero knowledge proof or a binary way of sharing important data without handing the data over. So the simplest example of that is. Is Denise over 18? Yes or no. Speaker 3 – 31:56 So rather than sharing your date of birth or giving away really important personal information that might be on your passport or your driver’s license or on your birth certificate, we could just really simply say, yeah, over 18 or over age. And so then all of a sudden, whether or not it’s drinking, you know, alcohol, entering a nightclub, hiring a car, getting your insurance, there are different age gates that either lower the cost of certain things or minimize the risk to the service provider. So what we have today is the embodiment of that paper. We’ve bought that paper to life. And I’m really excited to say last year we entered into a strategic partnership with a major Japanese multinational DMP and they at the end of last year launched a decentralized identity network in Japan based on our technology. Speaker 2 – 32:58 Awesome. Speaker 3 – 32:59 All of things in that paper, sort of, you know, how could we build these applications and enable the modern digital life to be supported? Privacy secure, clarity of consent, decentralized architecture. I’m super thrilled that we’re actually working with DMP on a daily basis to bring those use cases to life. Speaker 2 – 33:24 Congratulations, Katrina. You have been so far ahead of your time and that I think that’s why I really have so enjoyed our friendship and knowing you and having these conversations even back then, because I, I recognize that in you the minute we met, that, you know, you’re. I feel like we’re. We’re similar, maybe even like, you know, tech soulmates. And so I, I love it. And it’s fun for me to hear the successes that you’re having because I know that this is, this has been your. This is your purpose in life. 
You know, how some people, you can recognize they’re on a mission, they have a purpose. And so it’s really fun to me to see how it’s playing out and how. Speaker 3 – 34:14 Denise, thank you. I really appreciate that. And for anyone that’s listening, this last 10 years, boy, these have been the most challenging 10 years of my life. I mean, the fact that were kind of way too early conceptually, and also to the B2C market, yeah, I’m. I have so much gratitude and I’m in awe of the amazing team that we have, and we still have some members that were with us right from the beginning. You know, I get to work with the smartest, most inspiring, amazing engineers and people every day. And what I think has just been, I don’t know, luck or due to my stubbornness or perseverance, is that we’ve been able to evolve. Speaker 3 – 35:09 The company never move our North Star, but this idea that we wanted our technology to be used within a network where individuals could actually participate in a very transparent way and be confident of being able to do everyday things in their life. And how that’s evolved is kind of different. You know, again, when I listen to our conversation, many of those ideas, I was thinking, oh, wow, that’s really inspiring. We should still do that. Or I wonder if we could do that. But the end game, in terms of where were heading, what we wanted to achieve, that hasn’t changed. But, boy, has it been hard. You know, we’ve had, you know, two or three times over this last decade where it’s been, okay, guys, you know, we’ve got six weeks and it’s all over. Yeah, less. Speaker 3 – 36:06 And then all of a sudden, you know, the, Unless, you know, fell into place. And so, you know, for anyone listening and if you feel that you want to embark on the entrepreneur’s journey, first of all, you know, my commiserations, if that is something you just feel you have to do, because, you know, if you have to do it. But, boy, does it come with just this barrage of challenges. And, and it’s just you kind of get through one and then, and you think, wow, you know, how did we as a team navigate that? Technically, legally, commercially, whatever it is? Wow, you know, we’ve solved that. We’re really confident of product market fit, or we’ve, you know, we’ve now got this partnership in, you know, this particular sector. And then all of a sudden it’s something else that comes out of the blue. Speaker 3 – 37:08 And so it’s, you know, there but for the grace of, like, we’re still here a decade later, but every day there’s, there are new challenges. Speaker 2 – 37:23 Yeah. And it, and I think too, I think one of the hardest things is when you’re truly stepping out as a leader. One of my favorite quotes, and I have no idea who said this, but I’ve held on to this for probably the last 15 years, is that a leader leads before anyone is following them. And that’s a hard place to because you really are stepping out ahead. So you’re speaking a language that people don’t understand yet. And not. So not only are you pursuing a goal because you find purpose in it, that it, that is meaningful. And, and then you’re talking about it, but people don’t have ears to hear yet because they have to keep hearing it. Speaker 2 – 38:09 And, you know, so you’re not only going after your goal, you’re having to explain it to everyone and you’re having to really bring people along with you. And that takes time. It takes time to set up, and it takes time to activate. 
And, you know, so I, I, I applaud you for that because what you have done has not been easy. Even, even aside from the entrepreneurial challenges that, you know, entrepreneurs face, you truly are, you have stepped out as a leader from day one. And, you know, I just, I think that’s wonderful. Speaker 3 – 38:44 Thank you. Look, I think one of the challenges with that is it can feel really lonely and isolating, but so much less so now than a decade ago. I think, you know, a decade ago, I could walk into a room and not our conversation because you would meet, you know, fellow like minded people on the journey and that’s what was so lovely about our connection. But usually it was, you know, I’d go and pitch to a bank or I’d go and you know, speak in an event. And I was kind of the crazy person like, you know, who cares about personal data or digital identity or what’s she even talking about or you know, how would that even be possible? Speaker 3 – 39:30 And you know, there’s no way that as a bank we would, you know, we would ever consider that, you know, and I’m really proud to say now, you know, part of the reason I’m based in Belgium is that our technology is incorporated in three of the retail brands here under the KBC bank umbrella. And KBC bank has consistently run one. Best banking app in the world and best banking app in Belgium. And so they are actually using our technology to give their customers control over their data and to be able to exercise their digital rights and share data securely with each other. So 10 years ago, speaking to a bank, it was like, oh, there’s no way we would ever do that. Fast forward today that is happening. So the ability for those ideas to be more mainstream I think is possible. Speaker 3 – 40:25 But to pick up on your point around kind of where the magic is, I think that for me is what is so thrilling about the partnership with DNP in Japan. Because in Japanese culture, first of all, Japanese just have an amazing reputation in terms of creating technologies and being able to look at how to make things efficient and beautiful at the same time. Speaker 3 – 40:55 Yeah, but there’s also culturally a view that is much further into the future than I think certainly what I’m used to from an Australian perspective, an Australian enterprise perspective, you know, people are thinking quarterly, they might be thinking of their two year plan or their three year plan, but when you allow yourself culturally as an organization to be thinking of decades into the future and to kind of imagine, then this kind of magic that comes in, and I think that’s what I find so empowering and exciting about our collaboration is that we can have conversations that seem like they are somewhat science fiction and then say, well, how would that be possible? How would that be possible? How would that be possible? How would that be possible? Speaker 3 – 41:50 And then what that happens is it ends up as a roadmap component that we’re working on this year, even though we know that where we want it to be is maybe way off into the distant future. So I think having the ability to Collaborate and work with partners that have the same vision or a desire to think really big or think about the future. That’s just. That’s awesome. Speaker 2 – 42:21 Yeah, yeah, that’s wonderful. And, you know, and I, I feel like that’s, we’re gonna see more and more of that. 
People are going to start really understanding what the, what this next era of computing is bringing, you know, and the importance of these things and they’ll get more of a kind of a future scope look, you know, I think right now a lot of people are just nervous with what’s happening because we are in this kind of mindset shift of a lot of things happening, especially around AI. I do want touch, though, on the decentralized aspect. And, you know, back 10 years ago, were talking a lot about the centralized platforms, you know, Facebook and Netflix, and, you know, and the way they gather the data around you and how it really compiles these different perspectives, Persona or different attributes of your Persona. Speaker 2 – 43:16 And, and that the blurring of those is not always ideal. Right. You know, you want to be able to compartmentalize, you know, who you are in personal life, who you are in work. Now, I wanted to ask you because we talked about Minority Report then. Have you seen the show Severance? Speaker 3 – 43:36 It, I think no. Do you know what? I started to watch it and I, I’m, I have to confess, I’m a binge watcher, so I cannot wait. I can’t start something if it’s. Speaker 2 – 43:48 I know. Speaker 3 – 43:49 And then I’m not only just a binge watcher, but I’m kind of like a back to back binge watcher. So what I’ve been waiting for, I started to watch it and I thought, you know, what I’m going to do is I’m going to wait now for season two to be complete so I can kind of do back to back binge. Speaker 2 – 44:07 Yeah, I haven’t watched it either. Speaker 3 – 44:10 To Australia. Speaker 2 – 44:12 I haven’t watched it either. But I’ve had so many people telling me, you have to see this. And I understand the concept. The concept is basically, I mean, your brain remembers things when you’re at work about who you are at work. The minute you leave work, that shuts off. And now it’s who you are in personal. And it just, it’s so funny because it made me think about the conversations that we’ve had and. Yeah, so obviously that’s a kind of a dystopian take on this segmentation. Speaker 3 – 44:46 The dystopian thing. Like I Mean, you know, we were talking about some potential harms a decade ago. You know, if I look 10 years into the future. And you just mentioned that sort of, we’re kind of at this, you know, evolutionary place where, you know, where we’re moving from sort of centralized architecture to decentralized. We now have common ways to be able to securely access and share information. We recognize that digital identity is in everything. Whether or not it’s proving that I’m a person or I’m an entity, I represent an entity, or I’m a thing acting on behalf of a person. But without getting super dystopian, we’ve got some really tricky years ahead because we’re still in this kind of crossover. Technology’s moving faster than regulation. Speaker 3 – 45:46 We’ve got some really strange things happening geopolitically where there are very different views about the role of government and society and the role of the citizen and the rights of the citizen. You know, we’ve got differences in different parts of the world. I think there’s an opportunity for some really bad stuff to happen over the next few years until there is either more awareness or there are the right protections that are put in place. And I think that has to be on three levels. It’s technology evolving to make that possible. 
It’s the legal or regulatory or rights based environment so that we can say, okay, from a digital society, these are the rights we have in the physical world. Why shouldn’t those rights be identical in the digital world? And so part of that is around legal. Speaker 3 – 46:45 And then the other thing is, and I know we talked about this a couple of weeks ago and we caught up is the commercial, you know, money value, you know, is the way we will continue to transact only going to be through fiat currency, you know, dollars or digital currency, or the tokenization of everything where we may exchange value for a combination of money, loyalty, carbon, off set, you know, where we now have the ability tokenize everything and then set some business rules around how we exchange value. So even with monetary policy, we’re also in this state of flux. You know, there is, there’s been positive response to the US saying that they will now start a crypto reserve. There’s been backlash to that. Speaker 3 – 47:40 I personally think where we’re going to end up is with digital currencies replacing physical currencies and some type of digital reserve that will be held. I don’t know whether we’re going to have this conversation 10 years from now or 20 years from now. I’m not sure how long that’s going to take, but what it’s clear is that our monetary policy is also part of that evolution. And so we’re in this kind of murky place where all those things are kind of working themselves out to how they will evolve into the future. Speaker 2 – 48:15 Yeah, and that’s one of the things that’s so interesting to me about the spatial web protocol, because it’s hyperspace transaction protocol. So it is a transaction protocol. It enables the kind of the decentralization and convergence of all emerging technologies across networks. And what you were describing earlier about this kind of digital twinning of your identity, this digital twins spaces and all entities within spaces, so people, but also places and things and nested ecosystems and the programming of interrelationships between those things and their nested spaces. But that the other side of it is now instead of transactions taking place under a centralized authority, whoever owns a web domain now, transactions are at every touch point. You can program permissions and credentials and everything at every touch point because all these entities become nodes in the network now. Speaker 2 – 49:22 So when you’re talking about the decentralization of everything and then the ability to, you know, tokenize, when you think of tokenization and you think of like NFTs, non fungible tokens. Right. It’s just this, it’s this digital signature around ownership of things and how they relate to each other and how they evolve over time. It becomes this value exchange system. Speaker 3 – 49:50 Exactly. Because it’s not always ownership, although it could be ownership, proof of ownership, and then proof of transfer of ownership. It could be proof of control or just the right to operate that tokenized asset in some way. So that could because I am the CFO of a multinational company and I’m, I’m, I can cryptographically prove through my identity and my rights, the ability for me to operate any tokenized asset on behalf of that organization. 
So that could be moving money, that could be moving physical assets, that could be moving digital assets, or I could be a content creator that’s creating something where I have the ability to assert those rights as to whether or not it can be used for training data for AI, or whether or not I’m asserting my rights as an artist, a creator, creating music, creating content, creating this podcast. Speaker 3 – 50:56 And so I think what we’re going to start to see is cryptographic signature and tokenization and traceability to be able to say, you know, this is really Denise’s podcast. Yeah, you know, this conversation actually happened. I was in Zurich two weeks ago at a conference, dice. And a shout out to the team at dice. They did a fantastic job. We were looking at the rise of sort of ecosystems around decentralized identity. And a couple of guys said that they’d experimented with taking a white paper and having AI turn it into a podcast. And then the AI agents started to hallucinate and just make stuff up that wasn’t in the paper. And it sounded completely legitimate until it kind of fell off the edge. Speaker 3 – 51:51 And so, you know, I mean, you can’t see here, but in my office I have three little koalas with the Hear no evil, see no evil, speak no evil. Because I remind myself every day, just because I see it in the digital world doesn’t mean that it’s real, that it hasn’t been created. I want to know, I want to be able to say, how do I know that this is authentic, where it came from and what are the rights associated with interacting with whatever this is? Speaker 2 – 52:26 Yeah, and I, I feel like that can’t come soon enough because, I mean, you know, the main issues that we’re having with, you know, we’re using all of these emerging technologies in the most unsecured environment of the World Wide Web. And so we have hacking, tracking and faking, or surveillance capitalism. It is such a problem when there’s like a faked video or a faked, even faked phone calls, right? Faked voice audio. And you know, the average person, you cannot tell if it’s real or not. So it’s so important to be able to have that authentication to the content or whatever it is. Speaker 2 – 53:07 And, and I tell people like, you know, when we do, you know, when this protocol starts to expand the Internet capabilities into kind of this digital twinning of everything and the security layer that’s in there, you’re still gonna have these things, but it’ll be obvious that’s what it is. Right? And, and like, the whole faking stuff, it’ll become more of a parody because it’ll be obvious that’s a fake. Right? There won’t be the question you, of a person’s integrity or identity of did they really do that? You know, it’ll be really clear. Speaker 3 – 53:48 You need to get there faster. Denise. And I think the problem is that, you know, one of the challenges that we ask ourselves every day is how do we take our technology and connect it into some existing infrastructure with the least amount of Interruption. Yeah. So that’s part of the reason we decided to focus on an API platform and the infrastructure to sit inside an enterprise and connect into existing processes. Whether or not that’s customer onboarding or a loyalty application, or the ability to check into a hotel or prove your identity to an employer, it doesn’t matter. But the challenge always is how do you make that as simple as possible to fit with the existing infrastructure? Why? 
Because if you come with this big bang wholesale, you have to change everything, then the change takes just too long. Speaker 3 – 54:52 So it’s about trying to find what are these incremental ways of introducing these additional tools capabilities, Security fraud management, layering those in and helping an organization over time evolve towards this. Because the world you’ve just described, we need to be there today. Speaker 2 – 55:14 Yeah. Speaker 3 – 55:14 You know, we able to open up, you know, blue sky and immediately know. This video I’m looking at from Reuters, you know, it’s actually Reuters. Yeah. It hasn’t been created by someone else, you know, AI. Exactly. And so I think they’re the, they’re the challenges for us to be discussing now that we didn’t even really see a decade ago. Speaker 2 – 55:39 It’s so true. It’s so true. Okay, so a decade ago though, we talked a lot about predictive analytics because Minority Report, really, that was, you know, and then with all of the massive creation of data through all these interactions that were now digital and that’s only grown and mobile, that exploded. And now we’re going to, I mean, just with what’s about to happen with the spatial web, the number of nodes in the network is exponentially expanding. Right. So that just creates more data. So then you have predictive analytics. So were we in our assessment then, and then 10 years later, are we where you thought we’d be? Speaker 1 – 56:30 Oh, it’s worse. Speaker 3 – 56:31 It’s worse. And look, I remember, I mean, one of the things that you were very focused on when we spoke was kind of the future of work. And what we, you know, what we didn’t recognize, you know, we. There was no sort of sense of global pandemic. Yeah. And I think there was a really interesting report. I think it may be McKinsey’s. They published in the first year of COVID I think it was around sort of eight months in, where they were already seeing data that in terms of digital interaction, a speed up of technology adoption of kind of seven years in the space of sort of eight months. Yeah. Speaker 2 – 57:09 Because it was a catalyst for sure. Speaker 3 – 57:12 Yeah. Here’s a QR code. Here’s how it works. You know, here’s when I have to hand over data, here’s what that QR code will allow me to do or not allow me to do. Here’s a building that I can get in or I can’t. Here’s what I’m authorized to work on remotely. And, you know, you were talking about the future of work, but one of the things that we didn’t imagine, would it be that it would be some of these kind of black swan events that would just actually speed up sort of technology adoption and then back to the idea of kind of analytics. And it’s becoming even more difficult with a completely distributed workforce because there’s just this massive blurring line now between what’s personal, what’s professional. Speaker 3 – 58:04 You know, when I’m, when I’m surfing the web and I’m not at work and five minutes later I’m opening my laptop and I am at work. And so part of, you know, this melting pot we’re in also is the way we work, the way we live. That has changed so much in the last few years. Certainly post Covid and yeah, you’re on. Speaker 2 – 58:26 All the time in our lives. Speaker 3 – 58:28 It’s very hard to separate the roles in our lives. Yeah, yeah. 
Speaker 2 – 58:32 And, and it’s funny because, you know, I, I, unless I’m traveling, I’m working out of my home, and if I’m here and I’m just sitting here half the time, it’s like, okay, I have things to do, I need to work. Or, you know, we’ve gotten to this point where we can leave messages for each other 24 hours a day. And so then it’s like, you know, what’s the expectation on response time? And usually it’s immediate. You know, I know, you know, a lot of people have been trying to establish boundaries around that. Right. I saw, I saw the best meme and it was literally somebody’s out office message. You know, they were on vacation and it was like, if this is an emergency, take a deep breath. Because few things really are. I just thought that’s brilliant. Speaker 3 – 59:24 Well, again, you know, on that topic, because things are, you know, so 24 7. We’re also seeing, this is certainly something that’s happened in Australia. I know that it is already existing here in parts of Europe. Is this right to disconnect, that employees will actually have the right after a certain time at night or on weekends or on vacation, to actually be able to disconnect because the blurring, you know, the blurring is contributing to the ability to profile because we are kind of digitally connected 24 7. But it also means that we don’t enjoy so much of that ability to disconnect and kind of think and create and solve problems because we’re just constantly being pinged. It’s just this constant being bombarded. Speaker 3 – 01:00:32 And so I think one of the things that we will see and I hope may come out of the way work is evolving is finding what the right balance is between being on and being off enough to be able to solve problems. Yeah. And I don’t care what you do. I don’t care, you know, if people are listening to this and saying, oh, that’s fine if you work in an office or you work with technology or. But, but I think no matter what we do as humans, we just, we all need that space where we can decompress and feel like we can think. Speaker 2 – 01:01:11 Yeah. And, you know. Yeah, I’ve talked a lot about the fact that, you know, there are just times where I just, I can’t with my phone because it’s just too much, you know, and you feel overwhelmed by just the amount of noise that is bombarding your headspace constantly. And I think people have good practices. I know I’ve heard of a lot of people going, don’t even pick up your phone for the first hour. You’re awake in the morning. You know, I find that extremely difficult. Speaker 3 – 01:01:41 You know, my phone in my head before I’m awake. Speaker 2 – 01:01:44 I know I really reach over to. Speaker 3 – 01:01:46 My nightstand with my eyes closed. Speaker 1 – 01:01:48 I’m. Speaker 3 – 01:01:49 I’m not a good. Speaker 2 – 01:01:50 It starts with what time is it? And then it’s like, oh, God. So, yeah. But okay, so regarding the predictive analytics side, how do you see AI? You know, I mean, you’ve seen what’s happening with current AI and then we’ve talked a little bit about this kind of next era of AI that’s adaptive and self organizing. It’s also explainable and it’s programmable, you know, with human laws and guidelines, things like that. So there’s an AI that’s coming that can provide this kind of a safer path to collaboration between human and machines. So it’s not this dystopian picture as the only viable path. Right. 
Like there’s, there are pathways that we can really use this to the benefit of humanity and how we evolve as a species, which I really, I believe that that will play out. Speaker 2 – 01:02:52 But when you’re talking about predictive analytics and data and then how these agents will be aware of it and helping us and really becoming partners in our life, what do you see evolving with that and what are the things we should be really, really concerned about or protecting? Speaker 3 – 01:03:15 So I, I, I go back to our conversation a decade ago because I think were still on the right track. We didn’t have the right language. We weren’t talking about AI or extended intelligence or sort of algorithmic agents or any of those things then. But I think on a positive note, the ability for you to have to be able to access control your data and then decide which agents you want to open that data set up to for decision making was really, you know, if you go back to the manifesto that I wrote in 2011, that’s what that manifesto was all about. Yeah. Being the best source of truth or having the ability to provide data that is real time, in context, accurate with intent. Yeah. Speaker 3 – 01:04:14 So, so if I have data that is accurate, it’s timely, it’s contextual, so that an agent can actually help me do something that I intend to do, whatever it is, book a flight, change jobs, go to the doctor, you know, whatever it is, then the exponential ability for me to make better decisions or manage my life more effectively. I mean, I think that’s the plus side that, you know, there’s so many potential benefits. And that’s why I’ve always been a big proponent of the exchange aspect of data, the value exchange. Not selling your data, not monetizing your data, but making sure that your data is helping inform or guide access control or some sort of outcome. Right. Speaker 2 – 01:05:11 Well, and even to the point of personal insights, one of the things that stuck with me after that was Super Tuesdays because it has literally over the last 10 years has made me feel so much better about Monday. Speaker 3 – 01:05:29 I have to say I’m still a Super Tuesday person. And for listeners. Denise asked me a question 10 years ago about. I was sort of a guinea pig for some of the insight gathering around our early prototype products. And your question was whether or not I had learned anything about myself. And I just had no idea that for some reason I have this kind of superpowers on a Tuesday unlike any other day of the week. So the kinds of things that I managed to do, evidenced by my digital life on a Tuesday, certainly compared to a Monday, but also compared to any other day of the week. And so it’s also made me feel less guilty on a Monday because I always know Super Tuesday is coming and that’s when I’ll make up for anything I haven’t done the day before. Speaker 2 – 01:06:23 Yeah, and I think it’s important that we do learn these things about ourselves and we give ourselves the freedom to work within our own personal leanings. That gave me insight into, okay, what are my patterns? You know, and the funny thing is I tend to work on the weekends and I used to feel really guilty about taking a day off or something during the week. Right. Like, it’s like, well, people are expecting things from me, but it’s like I’m working on the weekends and I tend to be really productive on the weekends. So yeah, take some downtime during the week and don’t feel bad about it. 
Speaker 3 – 01:06:59 I think this idea of helping to surface insights and decision making is really the promise of personalized AI or sort of this agentic era or agents being able to provide, you know, services or execute tasks on your behalf. A few years ago I, I was very privileged to be part of what we hoped would be developing a standard for IEEE for a personal AI agent. And so the concept was that this AI agent would only ever have access to your data and act on your behalf. Speaker 3 – 01:07:40 And part of the reason that weren’t able to finish that body of work was that there were so many principle based questions, ethical based questions, of what does it actually mean for an agent to work on your behalf, what framework, what’s the decisions if it comes to things like health or exercise, productivity or parenting, and you know, what makes something for your benefit versus working against you. And so I’m very proud that what we did publish as a result of that body of work was some foundational principles towards personalized AI. But I think when that, when were doing that work, which is maybe four or five years ago now, things were still moving way too fast and too fluid to be able to say, okay, you know, here is a concrete approach for being able to do that. Speaker 3 – 01:08:40 I think there’s more, particularly here in Europe, there’s more regulatory clarity around AI that would enable you to design those things. But I think the foundation principles of who the agent is, who’s providing the agent and evidence that it is working on my behalf, that’s really the trap for, in the consumer space going forward. Yeah, that transparency that if you hand something over to an agent, it actually can demonstrate it’s working on your behalf. Speaker 2 – 01:09:15 Yeah. And one thing that’s really interesting that I’ve noticed over time and probably really, since mobile. Right. The explosion of mobile. And you see the power of all of this social inter. And, and everything else is that I think our ideas around consent evolve. And, and I say that because I, you know, like, if you look back on my profile pictures on Facebook, my first profile picture was a landscape. I, I think it was like a. Yeah, it was a landscape. And, and that was back when, like, don’t put your picture on the Internet, you know, and I mean, look where we’ve evolved to now. And I actually remember around. Speaker 2 – 01:10:03 It was probably around 2012 or 13, you know, a girlfriend of mine, we hadn’t seen each other in a few months, and she was one of those friends where, you know, we just, we click off of each other. She’s hilarious. When we’re together, we’re just, it’s laughs and we were catching up on, you know, the last few months dating different things, and it was just hilarious stories. But I had this flash in that moment of, what if this is the future for, like, reality tv? Right? I mean, when we get to the point where brain computer interface, are we going to be able to turn channels and find channels that are open in people in our networks and in it, you know, and you would think nobody would be that transparent. Speaker 2 – 01:10:51 But then look at the state of social media today, and you have people who are conservative. Like, like, for me, I see it as a way to kind of give my friends and family a sense of what I’m doing and, you know, who I am and my life. So they feel connected. But my most private things and a lot of the things I’m really going through aren’t there, and I don’t put them there. 
But then you have people who are really living out loud. And I feel like the younger generations that have literally grown up with technology in their hands since they were infants are more prone to this, but they’re living out loud and they don’t have that privacy understanding the way we do. Consent is a whole different thing. Speaker 2 – 01:11:37 So when you’re talking about data privacy and you’re talking about this evolution of consent, how do those coincide? Speaker 3 – 01:11:46 I just, in listening to you kind of describe, you know, a possible future, it sounded like a fusion between Neuralink and Twitter Spaces. Yeah, this idea that you could actually surf into kind of somebody else’s mind. Speaker 2 – 01:12:04 Back then, I was like, I need to make a YouTube channel on this. This would make a great show. But I also knew it was super early, and I would even tell a few of my friends who were in tech, I’m like, wow, I had this idea, and they’re like, that’s crazy. Speaker 3 – 01:12:20 I’m a little divided on the kind of, the living out loud, because again, I think this theme that we’ve been talking about, in looking back 10 years and kind of looking forward, is, you know, 10 years ago we were not at that point of evolution. Yeah, like it was really early days. Things were just developing. Whereas now I think the big difference is that we are in this kind of state, this massive state of change. And so one of my favorite lines in a song is from Led Zeppelin’s Stairway to Heaven, and it says, you know, there are always two paths that you can go down, but in the long run there’s always time to change the path you’re on. Speaker 3 – 01:13:14 And I think the interesting thing is going to be what will happen over the next couple of years, because there is evidence coming from young people saying, look, I don’t want to be on 24/7, but I can’t be off unless everyone is off. And so there are efforts in schools to make sure that children don’t have access to smartphones during the day. There are some interesting studies that have been published where young people have said, look, if all my friends did it, I would be happy to do it, but I can’t do it if it’s just me, but I would willingly opt in. Speaker 3 – 01:13:56 So I think we are kind of at this, you know, two paths we can go down, where we may find that 10, 15 years of studies of social media, and the harm that it is doing to young people, to the way they develop, to their self-esteem, to their mental well-being. You know, maybe 10 years from now the conversation we’re going to be having is a little bit like, you know, cigarettes, and looking back and going, oh my God, can you imagine that? We used to do this, and the harm and impact that had on children and developing minds and self-esteem and well-being, with that in a child’s hand 24/7. So I really hope, with the two paths, that there is action that will be taken. Speaker 3 – 01:14:56 I mean, Australia has recently taken some action around age-gating social media. There are books that are being published, there are studies that are now coming out, and I think... Speaker 2 – 01:15:08 Do you see a change in the predatory nature of the algorithms of these apps? Speaker 3 – 01:15:13 Look, the problem is I’m not on a lot of those platforms. I mean, with kind of the whole Elon Musk acquiring Twitter. I mean, I used to be a big Twitter user.
You know, I switched to Bluesky, I don’t know, a few years ago, and I use Bluesky every day, but I don’t post anymore. Yeah, I mean, I really love the product and it’s beautiful the way it’s evolving, and the growth and the content and the user base. You know, it’s really coming along. It’s fantastic. I really love it. But I just don’t post anymore. Speaker 2 – 01:15:47 Yeah, yeah. Well, so, okay, so you just made a prediction for something we may be talking about 10 years from now. Because I want to do these anniversaries. I feel like 2035, what is everything going to be like then? Speaker 3 – 01:16:07 How old will I be in 2035? It blows my mind, really. Speaker 2 – 01:16:13 I know. Time is such a weird thing. Speaker 3 – 01:16:16 Yeah, 2015 felt like it was just yesterday. So maybe 2035 will be the same. Speaker 2 – 01:16:22 Yeah. Yeah. So what do you have? Like, if you had any predictions for where you see us in 10 years, and then we can talk about it in 10 years? Speaker 3 – 01:16:33 Okay, I would use one word, and you know, it would be a whole other podcast to unpack it. But I think this next 10 years is going to be focused on trust. Trust in many forms. You know, whether or not it’s cryptographic trust, being able to know that something hasn’t been tampered with. Issuer trust, that content or credentials or recommendations have come from a reliable source that you can trust. Verification trust, that if you’re part of a network and you’re using a particular technology infrastructure, as the verifier of any kind of credential or content, you know that you are mitigating risk and fraud by being able to trust that infrastructure. Speaker 3 – 01:17:24 So I think the whole evolution of our digital life, our physical and digital life coming together, I think if we get that right, it will be this trust equation: how we are able to move into a digital space where we feel that we can trust the tools we use, where information is coming from and how we rely on it. And I guess really that’s my mission for the next 10 years, to contribute to building out, enhancing and developing that digital trust. Speaker 2 – 01:18:10 Well, I love that and, you know, I’m looking forward to picking that up in 10 years on another interview conversation. And obviously we don’t have to wait 10 years to do this again. But these decade, you know, kind of check-ins are really interesting because there’s such an acceleration with all of this technology now. You know, it’s the greatest time ever of scientific and technological discovery. And so it’s kind of nice to have these 10-year glimpses and predictions. So Katryna, I am just, I love you. I am just so thrilled to have you here and I’m so grateful to know you, and thank you so much for coming on my show today and having this conversation with me, and gosh, I really can’t wait to see you again. Speaker 3 – 01:19:04 Huge thank you, Denise. Big thank you for asking me 10 years ago, and very grateful to be able to continue that conversation with you a decade later. So thank you for your generosity and taking the time to craft this conversation and enable us to be together. So very much my pleasure. Speaker 2 – 01:19:28 Oh, and how can people reach out to you if they want to connect? Speaker 3 – 01:19:32 So Meeco.me is our website. If you want to inquire, I think it’s info@meeco.me; there’s a way that you can do that via the website. As I said, I’m on Bluesky, but, you know, I lurk and read, I don’t post. Or LinkedIn. Yeah.
And I think I’m just Katryna Dow, with a K and a Y, D-O-W, on LinkedIn. That’s also another possibility. So via Meeco, our website, or via LinkedIn. Yeah, love to hear from people. Speaker 2 – 01:20:10 Okay, great. And I’ll put all that in the show notes too. And thank you again, Katryna. It’s been such a pleasure. And thank you everyone for tuning in, and stay tuned, because as soon as this ends, I’m also pasting in the audio for our conversation from 10 years ago, which was audio only. So I’m gonna attach it right at the end of this. So if you’re interested and you want to give it a listen, just keep tuning in. All right, thanks so much. Bye everybody. Speaker 3 – 01:20:40 You’re listening to Collaborative IQ. Speaker 4 – 01:20:51 Hey everyone, welcome back to our show. You’re listening to Collaborative IQ and I am your host, Denise Holt. And I am thrilled to be able to bring you today’s show. Today we’re doing something a little bit different. If you’ve been listening to the past few episodes, you know I’m involved with a project on hacking the future of work, hashtag #NewWayToWork, with IBM and Pure Matter. And today’s topic fits right into this rabbit hole of the future. Today is all about data as individuals. If you could retain control over who uses that data and how it’s used, wouldn’t that shift your perspective? I think up until now, we’ve all just kind of assumed that we give it away because we have no choice. Speaker 4 – 01:21:39 My guest today is going to explain to you that her company is on a mission to regain personal sovereignty for us as individuals and consumers. I’m so excited to have Katryna Dow here with us today. She is the founder and CEO of Meeco, a company that believes that your data and info belong to you, and that you should be able to control the picture your data paints of you and retain control of that data’s value exchange, as more of a partner in the transaction rather than just a customer. Speaker 4 – 01:22:15 We will discuss how your data should be viewed by you as a form of currency, how you can benefit from the insights that your own data can give you about yourself, how controlling your own data will force brands into offering you a better customer experience, and the emergence of personal terms and conditions of data exchange, the personal terms and conditions that you set forth. And we’ll also talk about how you get the younger generations like Gen Z to care about their data. Katryna had some fascinating ideas of what this will mean toward the empowerment of these younger generations and the empowerment of all of us, really. So let’s get to it. This was actually one of the most fun conversations I’ve had in a while. I totally geek out on topics like this and I really hope you enjoy it too. Speaker 5 – 01:23:09 I’d like to welcome to our show Katryna Dow. She is the CEO and founder of Meeco, and Katryna is coming all the way from Sydney, Australia today. Katryna, welcome. It’s so good to have you here. Speaker 1 – 01:23:25 Good morning and thank you for inviting me. And I realize it’s not morning where you are, but I am just starting to see the sun rise and come flooding in through the window. So it is a beautiful sunny morning here in Sydney. Speaker 5 – 01:23:40 Wow, that is so nice. And I love that I could be talking to you halfway across the world. And we have lots of fun things to talk about today. So, Katryna, your company fascinates me. It is all about personal data and why we should care about it and how we can protect our data.
Why don’t you tell us a little bit about your company and kind of the concept behind it and how you came about conceiving that concept and what it is. Speaker 1 – 01:24:15 Okay, sure. So first of all, Meeco is a platform or a series of apps or applications for you to start to manage your life in a really easy way. In the digital connected world, there are a few things that we believe very strongly that data and information, personal information is really valuable, that you have a right to privacy, for you to be selective about how your information is shared. And most importantly, if somebody has information about you, that you have a right to understand what that information is and how it’s used. Speaker 1 – 01:24:57 So if I go back to where it all started, I often joke, you know, it started with Tom Cruise keeping me awake a lot at night, because over a decade ago, I saw the film Minority Report and I love sci fi and, you know, I’d been as a little girl, I was a real Star wars fan. And, you know, I wanted to grow up and be Luke Skywalker. And so, you know, and I remember seeing films like Gattaca, and I don’t know what it was about Minority Report, but there was that scene with him, if you remember, running through the mall where he was being tracked by all the advertisers and things. Yeah. Speaker 1 – 01:25:42 So, you know, Lexus was featured in the ad, which I’m sure they paid a lot for to be in the film, and American Express and, you know, he’s on the run and you’ve got all of these ads coming at him with all of these call outs with, you know, John Anderson, you should do this. And how long since you’ve had a holiday? And what about a driving experience? Experience? And I’m sitting there in the cinema, completely enthralled and thinking, is this the future? You know, is this what it’s going to be like when you. When you duck out of the office to do an errand? Speaker 1 – 01:26:23 And all of a sudden, you know, you’re driving along or you’re walking through the mall and there’s just this content constantly coming at you based on, you know, your past history, the things that you’ve done, maybe permissions you’ve given. And I don’t know what it was, Denise, but just that seeing really kept me awake for a long time. And so I kept thinking, what is it that I could do about that? And it was interesting because around that time, this was over a decade ago, really early days of social platforms starting to get up, people were using MySpace and then along came the Facebook. And I remember all my friends saying, you’ve got to get on this. And I was watching the behavior of friends and the stuff they were starting to share and post, and I was thinking, wow, this is really interesting. Speaker 1 – 01:27:21 If we keep doing this and we start to live our inside lives outside, will there be a point sometime in the future where it changes so that we start to really realize how valuable all of that is and we might want to be able to control it or have some way of kind of turning the dial up and down? And so for about a decade, I was the person at every party and with friends that was saying, hey, I think this is going to switch sometime in the future. And I think our personal information is really valuable and the context is even more valuable. So that’s where the idea began. Speaker 5 – 01:28:08 Wow, that’s fascinating. And you know, it’s funny because Minority Report is one of those movies that really had an effect on me too. And we’re there. 
I mean, with predictive analytics and all of the customization based on preference and location and everything that’s just coming down the hype right now, as far as catering this one to one experience for us, it’s really fascinating because I think about that movie all the time. So it sounds like it birthed a really awesome avenue for you as far as a career path. Speaker 5 – 01:28:48 And I am fascinated by your company and I’m just, okay, so when we think about predictive analytics, a lot of people, I think a lot of people are really unaware of the kind of data that’s being mined around them anyway as far as their personal data and what they truly are giving away, I think they’re really naive to the picture that is being painted about them to all of these brands. And then also I think that people like to have that personal one to one, catered to experience. So how does that weigh with MECO as far as getting people to understand the importance and the value in all of that data? Speaker 1 – 01:29:39 I agree with you, Denise. I think that vision that was featured in the film, I really think we are there now. We’re not there in maybe that completely space age cool way that the film portrayed, but we are really there in lots of ways around what we talk about as silo predictive analysis. And so this is one of the things that we really excited about at meco and we think where the disruption from you and I as individuals and your listeners, this is really where there can be significant social change and for the emerging generation, real disruption in terms of the way business and commerce and working with government can really be different. And the reason we say that is so much of the analysis that’s done right now is based on a silo. Speaker 1 – 01:30:40 So whether or not it’s what I Buy from Amazon and how Amazon sees me, or what I may do with the government department, or what I may do with my bank, or the, you know, the fact that I’ve just consumed House of Cards on Netflix, you know, one episode after the other. So we have all this silos in our lives, and what happens is those organizations get a real sense of who we are in that context. And one of the things is, you know, I really love Netflix, but it’s such a small slice of who I am as a woman, as a person. You know, it doesn’t necessarily reflect who I am as a professional or who I am as a friend, or it’s not giving you context around my fitness or my health or my financial well being. Speaker 1 – 01:31:42 And so one of the things that I really wanted to achieve with Mika was this idea that when we wake up in the morning, we are really the central character in our life. So it’s not that we wake up in the morning, oh, I’m so happy that I’m a Spotify customer or a Netflix customer or a Chase Manhattan customer. We wake up in the morning and we think there are things we want to get done in life, or there are experiences we want to enjoy, or we need help, we need advice, professional advice or assistance. And so we tend to think that things fall into categories of advice, experience, services, products. And we may have favorite brands, but first and foremost, we think about how those things fit into our life. And the idea with Meeco is it’s not about silos. Speaker 1 – 01:32:38 It’s this holistic view of who you are. And then you can start to decide where information from one part of your life might be relevant in another, or more importantly, where information from one part of your life needs to stay in that area. 
So if you look at what Apple has been doing, for instance, with HealthKit, with the application they’re making available for the latest versions of iOS, and in particular with the Watch, you see that they’ve got some really strict guidelines around how that data can be used and shared. And so I think we’re starting to see some really large organizations. I’m pleased to see Apple coming out very strongly about privacy to understand that it really needs to be contextual. And I think context is probably the most important word for anyone working in digital media or social marketing right now. Speaker 1 – 01:33:42 What is the problem? I’m getting information, but do I understand this? Information and data in context? Speaker 5 – 01:33:52 Yes, I agree with that 100%. And, you know, you just brought up an interesting point and it makes me wonder because we’re embarking in this future where bring your own device to work is becoming very normal. And when you are having your own private device and you are basically signing into your company applications and things like that, there is a blur of privacy right there. You are going to want to be able to keep certain aspects of your digital Persona very private from your work. So does MECO work in that respect too? Speaker 1 – 01:34:33 Well, look, I don’t want to jump too far in the future, but I can share with you that one of the things we’re working on right now is a bring your own cloud to work. So we think we need to go beyond the device. And we’ve got some early stage prototypes that we’re working with, actually piloting with a large company that realizes that they had a bring your own device policy and they realize that exactly as you’re describing, that may give them access to a lot of information and data that really belongs to the employee. Speaker 1 – 01:35:11 So we’re in the early stages of looking at what it would look like for you not just to have your own device, but to have your own personal cloud so that you could switch between the applications they want you to use without them tracking what else may be on your device that is personal. Speaker 5 – 01:35:30 That is so cool. Speaker 1 – 01:35:33 Yeah. And one of the things that we’re. And again, it’s really early days, but they have some very highly specialised technical employees and we’re looking to see whether or not we can help them actually give insights to their employees that may actually strengthen their resume by being able to substantiate things. So not to say I worked at this place and I have these particular skills, but to be able to say, you know, for these many hours in this environment, I did this highly specialized thing and I can actually prove that. So we’re really excited about we’re doing there. Speaker 5 – 01:36:12 You’re blowing my mind right now. You’re completely blowing my mind. Speaker 1 – 01:36:19 Well, one of the things talk about and you know, we’ve been. I was at a banking conference this week in Australia called Next bank, which was looking at what the next, you know, the next century of banking could be like. And it was interesting. One of the people that presented said, you know, the next generation of bank is not changing the furniture. You know, it’s actually about understanding how a financial organization or an insurance company fits into our life and helps us get things done. 
And one of the things that we shared during the conference is we think the hallmark of a future proof organization will be its willingness to share the data it collects about its customers directly with its customers for mutual value. Speaker 1 – 01:37:11 And whether or not you’re talking about and it’s employer or you’re talking about your bank, if that information is being collected and it’s collected in a silo, I think we’re going to see great opportunities for the organisations that go, you know what, we have to find a way not necessarily to give back everything because banks will have legal requirements and hospitals will have legal requirements to take care of information. We understand that, but there’s this blurry line where questions are being asked without context and if that’s shared and then there is this mutual collaboration, I think we’ll start to see some really exciting things that could reinvent customer and citizen engagement in ways we’ve never seen before. Speaker 5 – 01:38:06 Yeah, you know, it’s interesting because that brings to mind what’s happening with just in the world of science, with what’s being termed as citizen science, where because we’re able to tap into such a large sample group of people globally, scientific research is just blowing up exponentially. But I feel like the people who are the research subjects are largely just giving it away for free. And it sounds like you’re on the cusp of just empowering people in general with their information. And that’s the thing I love about your company is because up until now any data mining, it’s only been to the benefit of the brand or marketers or anybody who wants to take that and profit from it. But you’re actually empowering people with their, you know, in regard to their personal data. Speaker 5 – 01:39:04 And to me that’s so awesome and I think it’s so necessary with the future we’re about to embark on because of all of the things you’re referring to, just how much data there is in every aspect and how it can really help us as people to grow and become more productive. So I just honestly. Go ahead. Speaker 1 – 01:39:29 Sorry, I was just going to say I think grow. I just want to pick up on that word. Denise. I think one of the things that we talk about is that having some way to control or manage your personal information is a direct pathway to better decisions for you individually. And so our mission is to make you the most reliable source of data and information and insight about you so that if an organisation, a government, a research organisation wants or needs that information, then the best place to come is you. And we know that there are sort of start ups that are emerging around this whole new category. And I just want to share some financial insight that might be interesting to your listeners because people say to us all the time. Oh, but does it really matter? Speaker 1 – 01:40:26 I’ve got nothing to hide and is my data worth anything? Anyway, we often say, look, it’s not so much that it’s about what it’s worth from the point of view of selling it’s what it’s worth in terms of the insight and the better decisions that it can help you make in life. But if we just look at the value of this emerging market. We were looking at a report done by one of the leading large consulting global consulting firms over the last few weeks and they estimate in Europe alone that the personal information economy by the year 2020 will reach around 1 trillion euros of value. 
Now, the exciting thing is that they were talking about organizations like Meeco that will actually help €670 billion of that flow directly back to individuals through some form of value. Yeah, that’s very cool. Speaker 1 – 01:41:33 This is what excites me because, you know, I keep saying to our team, maybe it’s game over already for me and my generation, but when you think about young people that are growing up with all of their preferences, the stuff that they’ve been doing socially, the fact that their total education is going to start online or in some digital way, really part of our mission is to help that emerging generation have an asset which is every bit as valuable as money, that they can have from a very early time in their adulthood, to change the way that they interact with organisations, institutions, government. Because when you think about the collaboration economy, we’re going to collaborate in really different ways to the way we do right now. And money is just one small part of it. Speaker 1 – 01:42:36 It’s really not the way the entire world is moving. Speaker 5 – 01:42:41 Right. I totally agree with that. It’s interesting because we are facing a future of automation. I’m working on a project right now with IBM and Pure Matter, where I was one of 32 VIP influencers that was brought to New York late last fall to be part of a think tank, a think-a-thon, that was basically charged with the mission of hacking the future of work. And basically it was like, here’s the picture of what it will be like in 2025, that’s only 10 years from now. And here’s where we are now. How do we bridge that? And it’s because so many things are going to be changing. Work as we know it will not exist. So it will be very interesting. But I think you’re right. Speaker 5 – 01:43:32 I mean, just the whole entire monetary system, or at least currency, will be different. There will be different value trades for currency, not just money. Speaker 1 – 01:43:44 Absolutely. We believe that. Sorry, we believe that data is a form of currency and that it will be contextually exchanged for different forms of value. So we understand money is value, but status is value, triage, that’s value. So imagine you turn up at a hospital with your child, and you as a parent have been protecting the data and information about their medical history. And in an emergency situation, you’re able to access the information that could save your child’s life. You’re not going to bargain or barter over the use of that information in that moment. But you have that context and you can provide that in exchange for an outcome which is optimal both for the hospital or the emergency centre, but most importantly for your child. Speaker 1 – 01:44:54 And so we think this idea that people go, oh, but is this just about selling data? No, it’s not. It’s about saying I will see value in different forms at different times in my life, and I want my data to be able to act as a currency or an enabler for me to access those different forms of value. And sometimes it might be, hey, bank, you know, you keep asking me these things and keep sending me these really stupid offers that don’t match my life in any way. I’m happy to license some of my data about me to you, but in return I want to start seeing some tailored offers. So, you know, we see all of those as possible. Speaker 5 – 01:45:43 Okay, so I have two questions. Speaker 1 – 01:45:46 One of them is just on what.
Speaker 5 – 01:45:47 You were, what you just mentioned with getting a better customer experience. So we’ll go back to that. Remind me, because I definitely want to ask you something about that. But really quick, before we do, I just want to touch on what you were talking about, the younger generations. And you know, it’s interesting, because I had not thought about it from the aspect of what you described, as far as, from the get-go, giving them this value trade in their information, setting them up from the beginning to where they actually have something of value in all of their personal information. But the question I have with that is, you know, like Gen Z, right? Speaker 5 – 01:46:23 You know, which is basically anybody under 20 right now, you know, 19, 20 year olds are kind of on the cusp of millennials, but they’ve grown up like with MySpace in sixth grade, you know, and then Facebook, and then like my daughter, she’s 19, and I think her generation, they don’t really understand why privacy is even an issue. You know, growing up living out loud. And I’m just wondering how will you reach them. Because also, younger people tend to not care about these types of things until they get a little maturity under them. So how do you let them know that this is really important? Are you going to target parents? Are you, like, how, what’s the approach with that generation? Speaker 1 – 01:47:10 Look, I think the interesting thing is, and you know, if you target parents, it’s like targeting around anything that you hope that your child isn’t doing or shouldn’t be doing. We’ve really got to find the language that resonates for the generation that has the context. And there’s a couple of things I’d like to share with you on that. So, first of all, we have Aviva, our intern, who’s just come back from spending a bit of time in Berlin. She’s just turned 20, and she is my go-to person. If we’re working on a concept, we run it past Aviva because she’s either going to say, hey, that’s really cool, or I don’t get it, or why would she do that? So, first of all, we see that’s really critical to have in our team. Speaker 1 – 01:48:05 This week in Australia, I met Maarten Leyts, who’s actually the founder of a company called Trendwolves in Berlin. Sorry, not Berlin, in Belgium. And his organization just specifically looks at the trends that are coming, way into the future, from the emerging generation. And he and I had some really great discussions about this because. Speaker 5 – 01:48:35 What’s the name of that company again? Speaker 1 – 01:48:42 Trendwolves. And I can send you some, I can send you Maarten’s details. But what’s really interesting, and I’ll include. Speaker 5 – 01:48:51 It in the show notes. Speaker 1 – 01:48:53 Yeah, great. So what’s interesting about what they’re doing is they are looking at what is the language that’s being spoken, what are the metaphors, and what is it that is actually speaking to the hearts and minds of the emerging generation? And I think the idea of saying, oh, you need to be private, it’s not going to work. But at the same time, it’s really interesting that, and I’m sure you see this with your daughter, kids know how to be private and hide things from their parents. You know, they understand how to master things so that they live this dual life, the life with their friends and their life with their family or their teachers or any kind of authority.
Speaker 1 – 01:49:41 So for us, we’re saying, okay, how do we tap into that skill they already have, which is around contextualizing their life and deciding who sees what and Help them to understand that they’re actually sitting on something, that it has real value for them in their life, and that they actually have some power as young people that certainly my generation didn’t have, but could be really meaningful for them in changing the conversation. So instead of hitting them over the head and going, oh, you shouldn’t do this, what we’re working on is, hey, you are a generation that is able to really shift the conversation. Speaker 1 – 01:50:22 If you start to understand the value of what you have in the same way that past generations have understood the value of land or money or title, you’ll be able to walk into a bank or an insurance company and you’ll be able to run some personal analytics, your own algorithm over your own, and calculate the value that you represent for a brand for the next 10 or 20 or 30 years. And you will have the power to negotiate things at a really early stage of life whereby an organization, say, like a bank, may not take you seriously until 10 or 15 years later when your career is established. Maybe you have children yourself, you have a mortgage. You have become, from the bank point of view, the perfect customer from a product perspective. Speaker 1 – 01:51:18 And what we’re saying to the emerging generation is because data and analytics will be able to predict your value in the future, if we can give you that capability, imagine how you can change the conversation with your government, about voting, about your city, about public transport. Speaker 3 – 01:51:39 And. Speaker 1 – 01:51:40 And this, Denise, is really what excites us about engaging the emerging generation. Speaker 5 – 01:51:46 Yeah, I love that approach. And, you know, that’s actually really smart because the way you can get any teenager’s attention is to empower them. And you’re taking that empowerment straight to them. And I think that’s why they love Snapchat versus something like Facebook. I think Facebook is for their public Persona, the very masterfully curated just for family and whatnot. But Snapchat is them doing all the things that their parents really think they’re not doing, but they’re sharing it with their friends because they think it’s going away and they think they have some control. So I think you’re right. I think really showing them the control this gives them over their lives is key. I think that’s so smart. Now, really quick to go back to the other idea that you were talking about. As far as the customer experience. Speaker 5 – 01:52:49 On the video on your website, it says everyday preferences create value for the brands we interact with, but it doesn’t always translate into a better customer experience. So your company, the fact that you’re empowering people with their data, it could cause brands to really have to pay attention and focus more on the customer experience they provide. And to me, it’s kind of like how social media itself gave consumers a voice and it really changed brand messaging from being outbound to inbound. So I feel like the empowerment you’re giving is kind of doing that, but on a much more personal level. So why don’t you talk a little bit about that and maybe let our listeners know exactly how it can affect it? Speaker 1 – 01:53:39 Sure. 
So on the marketing side, our goal is very specific, to be able to give you the power to own your preferences, to have the insights that you need or would like to make better decisions. And we believe the way to start with that is to help you understand that the segment of one view, the view of you to tailor information, offers marketing experiences is really valuable. And if we can help you create that and own that, then not only will that create value in exchange with the brands or businesses that you interact with, but it becomes a really powerful tool for you to understand how that context may or may not translate to other parts of your life. Speaker 1 – 01:54:29 I think one of the things that I should address right now, because listeners may be thinking, oh, but do we want a company like Meeco to have all that information? First of all, we have a legal constitution and framework that says your information is yours, your data is encrypted, we don’t read it, we don’t mine it, we act as the insights engine to give you the means to be able to draw insight or capture or share. And as I said, we’re also working on some interesting cool things that I can’t go into too much detail now, where the data can be anywhere you would like it to be encrypted and totally controlled by you. Speaker 1 – 01:55:20 So I think it’s just important to say that because the empowerment starts with, well, companies have this capability or companies like Google or Facebook give me access to cool products for free. But in fact I pay twice because maybe I do pay a subscription for something or I buy something online, but actually I’m also buying, I’m paying with my data. Because the whole data broking industry is multi trillion dollar industry and most people don’t realize that stuff’s being traded, the personal stuff being traded every single day. So first and foremost the data is yours. Speaker 1 – 01:56:05 Then once you start to understand your preferences and you are able to say to an organisation, look, this is really what I look like and I’m happy to share this with you, but the contract needs to look a bit like this, I need to know what you’re going to do with the data. I need to know if you’re going to share it with another third party. I need to know whether or not you can look at it on my device or in my personal cloud, as opposed to transferring it to your systems and then on selling it. Speaker 1 – 01:56:35 So what we see is that part of the exchange and the empowerment for the individual is to say, yeah, I’m happy to help you understand me as a customer so that you can serve me better, which is obviously going to reduce cost for you as an organization, however. But I have an equal. I’m a partner in this transaction. I’m not a customer. That is just being done to. I want to be somebody who is doing this together with you as a brand. So I also need to be able to say, this is how you use my information. Speaker 1 – 01:57:12 So what we’re going to start to see in this emerging generation and linking back to the future of work is we’re going to start to see individual context setting and we’re going to start to see the emergence of the personal terms and conditions so the individual actually saying how their information can be used. Speaker 5 – 01:57:33 That’s interesting because it’s almost similar to how we deal with copyrights. You can sell or exchange limited rights to the. And I had never thought of it that way before. And that’s a fascinating tool to be able to kind of wield in your favor. 
Now, I have one more question for you kind of as a closing question here, because I think I’ve taken up enough of your time. I feel like I could talk to you forever about this stuff, Katrina. It’s just so awesome to me. Okay, so the insights that people gather from MECO about themselves, about their preferences and their interactions with different brands and really their behavior, what do you think people would be most surprised to learn about themselves through these insights that they gather? Speaker 1 – 01:58:33 It’s interesting because I think, again, you know, we’re all different. We’re all fascinated by different things. I learned that I am. I’m kind of this Super Tuesday human being, which I didn’t realize it was so weird because, you know, I’ve been. I’ve been the Nico, you know, guinea pig and test subject for, you know, right from our proof of concept, you know, through, you know, every new feature that we add. And I have this thing with Mondays where I often feel like, why aren’t I really motivated on a Monday? And it’s often because I do work over the weekend. And then when I started to look at my Miko data. I went, wow, I am super productive on a Tuesday. I don’t know what it is. Speaker 1 – 01:59:23 All of a sudden on a Tuesday, I’m doing stuff online and the things I’ve been putting off and I’m answering emails and I’m paying bills and I’m booking things. And when I started to see that pattern emerging, you know, it just, it’s a silly thing, but it, you know, I wake up on a Monday and instead of feeling like, oh, you know, the week ahead and I have to do all of these things, I just think, hey, tomorrow’s Super Tuesday. You know, I know that I. Speaker 5 – 01:59:49 Tomorrow I kick ass. Speaker 1 – 01:59:53 Who cares? I’m just going to relax and kick back and enjoy Monday unfolding because tomorrow is Super Tuesday. And that insight is kind of silly in one way, but it really changed the context of how I approached my week. And then in the office we started saying, well, which day of the week do you see that you tend to be more productive? And it was different for all of us. So that’s one aspect around productivity where the platform is right now in the apps which are available on your iOS devices. So your Apple devices, your phone, your iPad, or we have a web application and it’s all synced and encrypted across a personal cloud so you can access across your device. So what we have right now are a series of applications. But what’s exciting is that we’re about to put into beta. Speaker 1 – 02:00:53 It should be a release in the next few weeks. The ability to verify your account, which will give you even more personal insights. The ability then to join contacts and then encrypted messaging so you can start to exchange and have conversations within the application. Then peer to peer sharing, where you can share things with the people you’re connected with, which then leads way to then sharing things with organisations. And the reason we’re doing it in that order is we really want people to start to feel that value with their partners and their family and their friends at a community level, and then have an understanding of how that may create value with a brand or an institution. 
Speaker 1 – 02:01:38 But we think it really needs to start in a personal way so that trust is established by collaborating with people you already trust, you know, your friends, your family, and then from there to understand the context of what or how or if you might want to share things with the wider world: brands, businesses, the school. For us, we think it’s a journey, and the way to start that is to start with people where there is already trust. Speaker 5 – 02:02:17 Very, very cool. Okay, so people can, like you said, they can download the app in the App Store, or Google Play, right, for Android? Speaker 1 – 02:02:30 No, we don’t have Android yet. I have to say we need a shout-out from your listeners about how fast we need to be getting onto the Android solution. So at the moment you can use the web version on an Android device, particularly if it has a large screen. We’re starting to get feedback from the community saying we want to see it on Android. So if your listeners are part of that shout-out, then I’ll make sure that we’re listening to that and taking that back to the team. Great. Speaker 5 – 02:03:08 And I’ll be sure to include the website and everything in our show notes for our listeners who are listening right now. The website is meeco.me. It’s M, double E, C, O, dot M-E. And Katryna, how can people reach out to you? What’s your preferred way of people getting in touch with you? Twitter? Speaker 1 – 02:03:31 Yeah. So either via Twitter with Meeco, or by Twitter directly with me, which is @katrynadow, which is K A T R Y N A D O W. Or through Penelope Hogan, who is our content and community lead, and she’s actually just this week launched a platform for our community, our users we call Meeps, for our Meeps to provide feedback directly. And she’s looking for people to put their hands up if they want to be part of our alpha group that can actually collaborate with us as we’re designing features and before we roll things out. So if you have any listeners that are interested in actually collaborating right at the beginning of a feature development, then we would love to hear from people that have that interest as well. Speaker 5 – 02:04:35 That is so cool. That is awesome. Katryna, thank you so much for taking time out of your day to be here with us today. This has been one of my favorite conversations I’ve had in a long time. I’m a total techie geek girl at heart, so I love this stuff, and gosh, I hope you have a wonderful day and a wonderful weekend. And thank you so much again for being on our show. Speaker 1 – 02:04:59 Pleasure. Thank you.

© 2022-2024 Denise Holt, AIX Global Media – All Rights Reserved