2885: From Control to Freedom: Self and the Quest for User Empowerment
Tech Talks Daily · May 05, 2024
2885
38:34 · 30.89 MB

2885: From Control to Freedom: Self and the Quest for User Empowerment

Can technology foster an ethical, user-controlled future? Today's episode features Jonathan MacDonald, a visionary leader and the force behind self.app, a platform dedicated to reshaping how personal AI assistants interact with user data under a banner of ethics and empowerment.

From his early exposure to computers in the 1970s to becoming a stalwart advocate for digital rights, Jonathan has consistently championed the need to reverse big tech's overpowering influence.

Self, rooted in blockchain and quantum computing, embodies his commitment to transparency, control, and human rights in technology. Jonathan shares compelling insights on navigating the complex landscape of fundraising, where human principles often clash with profit-driven motives, and paints a vivid picture of a tech future that prioritizes user empowerment over exploitation.

Join us as we dive deep into the ethical challenges and transformative potential of today's tech landscape with a pioneer who believes in technology's power to enhance human freedom rather than constrain it.

Where is technology taking us, and how can we ensure it remains a force for good? Share your thoughts and continue the conversation online!

[00:00:00] In a digital age where the boundaries between personal freedom and technological encroachment

[00:00:07] continue to blur, how do we navigate that complex relationship between ethics, technology

[00:00:15] and our inherent human rights?

[00:00:17] Well today I'm going to sit down with Jonathan MacDonald, a trailblazer in the realm of ethical

[00:00:23] technology and the visionary behind Self.app. And with a history steeped in the fight for

[00:00:30] user control and privacy dating right back to the 70s, Jonathan offers a compelling narrative

[00:00:37] on the potential of technology to empower rather than diminish.

[00:00:42] And as we explore his journey today from the early days of computing to leading the

[00:00:47] charge against tech's dystopian trends and we've all seen a few of those on our

[00:00:52] news feeds over the last few weeks, Jonathan's going to share his vision for a future where

[00:00:56] technology serves humanity on our terms.

[00:01:01] But what does it take to pivot the tech industry towards a more ethical path?

[00:01:06] And how does Self embody this shift by putting control back into the hands of its users?

[00:01:12] Well, these are just a few of the questions we're going to explore today, so buckle up

[00:01:17] and hold on tight as I beam your ears all the way towards Australia today where Jonathan

[00:01:23] is waiting to share his story.

[00:01:27] So a massive warm welcome to the show, can you tell everyone listening a little about

[00:01:32] who you are and what you do?

[00:01:34] Sure thing, well thanks for having me, my name is Jonathan MacDonald, I've been

[00:01:38] an entrepreneur for almost 40 years now, so I'm older than I sound, and I've

[00:01:46] been building businesses and helping companies grow, small and large, including

[00:01:51] my own. I've had 10 startups of my own, I'm on my 11th now, and so yeah, I'm basically

[00:01:58] always doing one thing which is always the same except people call it something different

[00:02:04] so what I've been involved with is always trying to reverse the power that big tech

[00:02:11] has over the public and reinstate it as subservient to people, to work for people under their

[00:02:19] control, including keeping your data to yourself, with sovereign agency over your information,

[00:02:27] and so that's what I've always been doing since I was a kid since the late 70s and

[00:02:33] so now obviously it's slightly more fashionable than it was in the 70s, 80s and 90s but

[00:02:38] it's still basically bashing your head against the wall and trying to convince a world

[00:02:44] full of humans that it could be a lot more utopian and less dystopian if we thought of things

[00:02:50] differently, that's basically what I do.

[00:02:52] And that's so much of what I love about what you're doing, and it's why I invited you

[00:02:56] on the podcast because I'm curious though before we go into this and dig a little

[00:03:01] bit deeper into your story I'd love to find out how it all began for you, you said

[00:03:05] this started right from when you were a kid. What lit that spark in you, that entrepreneurial side, using

[00:03:11] technology for freedom etc.

[00:03:13] Was there a pivotal moment that put you on this path?

[00:03:17] Yeah I think that my starting point was my father was a computer scientist and an engineer

[00:03:26] for various large tech infrastructure firms like Marconi and NASA, and so he brought

[00:03:35] computers home when I was growing up, when I was like three, four, five years old.

[00:03:40] So in the mid 70s I was exposed to all this long before the World Wide Web, I mean it's

[00:03:48] almost 20 years before the World Wide Web, so this is old school internet. And so I've

[00:03:53] always been looking at these tools in terms of what we can do with them, and through my

[00:03:57] non-cynical, innocent perspective I figured that we could get these things to help

[00:04:03] us. And little did I know that there is an entirely cynical version of that, which is:

[00:04:11] you, human, are the battery; your attention and tolerance of our junk will feed our

[00:04:19] shareholder value. And those concepts don't occur to you when you're five,

[00:04:23] and when they do occur to you, when you're kind of like 10 or 15, there's this horrible

[00:04:29] moment of realisation that can stop you in your tracks, and you either give up or you

[00:04:36] double down and go, this is what I'm going to dedicate my career to, sorting

[00:04:39] it out.

[00:04:40] There's so much in that. I always say in every episode here that technology works

[00:04:46] best when it brings people together, and I love demystifying technology, talking

[00:04:50] about it in a language that everyone can understand, and how it can transform

[00:04:53] our lives, our work and even the world. And I think all too often in life,

[00:04:59] and we were just talking about your childhood there, we begin with this: go to

[00:05:02] school, get a job, get married, get a house. And I think there's so much more to life

[00:05:07] than that, and your work exemplifies that. And it's also that fundamental

[00:05:12] human right to freedom that you seem to be so passionate about as well,

[00:05:15] especially in the context of modern business models and technology

[00:05:19] platforms. So to set the scene for our conversation today, can you just

[00:05:22] share how Self's technology embodies this principle, everything we're

[00:05:26] talking about here, particularly in contrast to those walled ecosystems

[00:05:30] that are prevalent in the industry right now?

[00:05:32] Yeah, for sure. I mean, my work with Self is multi-layered, because on the face of it

[00:05:43] I'm involved in creating a personal language model artificial intelligence,

[00:05:48] what we call adaptive AI, which is under your control and is your

[00:05:52] personal algorithm with your own self-server, and that's what it looks like

[00:05:59] right away; go to self.app and you'll see that that's the offering, as it were.

[00:06:05] But the wider picture, the broader landscape, is somewhat more abstract than that.

[00:06:12] The way I believe the future of operating systems should work

[00:06:17] is the device-agnostic way, where any application that's used is yours, under

[00:06:26] your control, and is intelligent to your preferences and desires and demands.

[00:06:31] And the end game of an operating system like

[00:06:37] that, the one that I have been building for eons behind the scenes,

[00:06:44] is called Entirety, and it's been waiting for three primary things to happen:

[00:06:49] one was distributed ledger technology, blockchain, to really work, the second

[00:06:54] is artificial intelligence to be able to be under your control, and the third

[00:06:58] is quantum computing.

[00:07:00] The engine room for that has to be your own personal algorithms. And so if

[00:07:06] you kind of build it backward from the end game, which arguably is

[00:07:11] actually a starting line, you get the components that are required. And if we look at

[00:07:16] the bits there, I mean, I was involved in the crypto world since the 90s, the

[00:07:22] early chat rooms of the World Wide Web.

[00:07:25] I was a follower of a guy called Tim May and a guy called Eric Hughes and a

[00:07:29] guy called John Gilmore, and so there was quite a rebellious

[00:07:33] streak to those conversations that became the cypherpunk movement, and

[00:07:37] the cypherpunks' electronic mailing list was pretty edgy at the time,

[00:07:42] and that was the early days of crypto and blockchain. There's the

[00:07:49] b-money paper in 1998 by a guy called Wei Dai, and 10 years later the Bitcoin

[00:07:54] white paper, so you know, by the time Bitcoin came out, not only were we

[00:07:59] all mining coins each day, but equally we'd already been talking about it for

[00:08:04] 15 years, and various things such as that. So that was that part. With Self, on the AI side,

[00:08:11] I was watching through the 80s, there was the AI winter as it was known, 80s

[00:08:15] into 90s, and then the machine learning algorithms started to kick in

[00:08:20] quite heavily, and IBM and other firms were really kind of investing

[00:08:25] into these spaces. And then I noticed that the large language models

[00:08:28] were moving further and further away from what I would see as the ideology

[00:08:32] of human rights, not least the fact that our first fundamental human

[00:08:37] right according to the United Nations Declaration of Human Rights is that

[00:08:41] all humans are born free and equal in dignity and rights, and I've always

[00:08:44] natively figured that that included our own personal data. So quite how

[00:08:49] that's been categorically and barbarously abused is something

[00:08:54] that I find shocking to this day, and so I started to build Self as an

[00:08:58] antidote to that. And then of course we need the environment

[00:09:04] of quantum computing to process things in a far more efficient way,

[00:09:08] and luckily, because of various different Moore's law and exponential growth

[00:09:14] theorems and realities, these things are now more possible. Things that I

[00:09:18] was dreaming about in 1979 became slightly more realistic in 89,

[00:09:23] quite visible in 99, and then, watershed moments, 2008, 2009, 2019,

[00:09:32] the world started to pick some of these concepts up and piece them

[00:09:34] together, and I think by 2029 we'll look back and go, well, that was

[00:09:39] obvious. And everything seems improbable until it's done, right? So

[00:09:44] it's been a bit of a mission.

[00:09:46] And you said, looking back there, you know, you were right in the heart

[00:09:50] of things from the beginning of the Bitcoin days, and how

[00:09:53] everything looks obvious. Did you spend 10,000 bitcoins on two

[00:09:56] pizzas back in the day? Do you have any war stories from those

[00:09:59] days?

[00:10:00] Yeah, yeah, I spent quite a number of Bitcoin on a char, and

[00:10:04] I'd just rather not talk about it.

[00:10:09] And as you said there you know the United Nations Universal

[00:10:13] Declaration of Human Rights, it champions that idea of all

[00:10:15] human beings being born free and equal in dignity and rights,

[00:10:19] but you take a look around out there and that's not what we're

[00:10:21] seeing.

[00:10:22] I know this is something you're passionate about so can you tell

[00:10:24] me a little bit more about this mission and the development

[00:10:27] strategies you have at Self?

[00:10:29] And in what ways are you ensuring that these rights are

[00:10:32] respected and upheld? They don't have to be a pipe dream,

[00:10:35] we can make it happen, right?

[00:10:37] That's right. I mean, it's not tremendously complicated to

[00:10:42] change, to at least entertain a perspective that's

[00:10:47] different. I mean, if you have the perspective of big tech, which

[00:10:50] is command and control, eyeballs, attention, number go up,

[00:10:55] shareholder get rich, human put up with it, that's one angle.

[00:11:00] The other version is, you know, what we could have is a bunch

[00:11:03] of commitments to how you operate your business and how

[00:11:06] you build. And so Self, which is the engine room of

[00:11:09] Entirety, the operating system, will start to be more

[00:11:13] visible as the weeks and months go by.

[00:11:18] At the time of recording, obviously, we're in 2024, and

[00:11:23] this is our year to actually start telling the public about

[00:11:26] what we're doing, including the light paper and white paper

[00:11:29] and economics and stuff. But the commitments that we have are

[00:11:32] the opposite of what I've just mentioned, the kind of

[00:11:34] dystopian view of large tech, and we have commitments such as:

[00:11:39] decisions that are made have to be in accordance with

[00:11:42] human rights, not just the first one around freedom

[00:11:44] but everything else.

[00:11:45] We also believe that we shouldn't cause or enable harm

[00:11:48] while upholding those rights.

[00:11:50] Now, we believe that enabling people to have control and

[00:11:54] ownership over their own data, and to decide how it's used, is a

[00:11:56] fundamental principle.

[00:11:58] I think that we should all be fully transparent in our

[00:12:02] practices so that people understand how we operate,

[00:12:04] not least going on brilliant podcasts like yours, which

[00:12:08] I found absolutely brilliant when I was looking

[00:12:11] at the podcast; I was wondering how some of your

[00:12:14] episodes even went out, because you recorded so many

[00:12:18] of the first 2,000.

[00:12:19] So being transparent is part of being on these kinds of

[00:12:24] podcasts, and yours especially.

[00:12:26] We also think that upholding regulation and legislation

[00:12:28] is important in the territories we're in, so that we're not

[00:12:32] operating against the system abhorrently to kind of

[00:12:37] stick two fingers up at the machine.

[00:12:38] What I'd prefer to do is try and fix the machine if

[00:12:41] possible or at least one percent at a time or at

[00:12:44] least one life at a time.

[00:12:46] And then finally, to not sacrifice any of our

[00:12:48] responsibilities in pursuing commercial gain, which

[00:12:51] is probably the reason why a vast volume of VCs,

[00:12:55] and even private equity and angels, have turned their

[00:12:59] noses up at what we're doing, because the

[00:13:02] priority is not commercial gain, the

[00:13:04] priority is human rights.

[00:13:06] However, I believe that if you create a

[00:13:08] significant amount of value for humans, then you

[00:13:10] can monetize it as you wish, in accordance with those

[00:13:13] rights.

[00:13:13] So for instance, people will pay for things that

[00:13:16] they value, if they're shown it's something that's

[00:13:19] benefiting their lives.

[00:13:20] They will reluctantly pay for things that they

[00:13:23] have to, that they get paywalled for.

[00:13:25] But over time, I don't believe that those bets

[00:13:29] are the most solid or have the most longevity.

[00:13:32] And at the moment the market would say the opposite,

[00:13:34] and from being rejected by most investors I've

[00:13:38] certainly felt the wrath of answering

[00:13:43] incorrectly when people say, so surely what you

[00:13:45] can do is you're going to harvest everyone's

[00:13:47] data and sell it to the highest bidder.

[00:13:49] And I'm like, no, that sounds horrible.

[00:13:51] And then there's tumbleweed, you know, the

[00:13:53] phone goes dead and none of the emails

[00:13:55] get responded to.

[00:13:56] But anyway, they're the commitments that we make,

[00:13:58] and it's not necessarily the most popular

[00:14:01] view, and it's a pretty lonely journey.

[00:14:05] But it is not a popularity contest.

[00:14:08] 100%. I think it was Jack Parsons that once said

[00:14:11] that freedom is a two-edged sword, of which one

[00:14:13] edge is liberty and the other is responsibility,

[00:14:16] but I prefer the Spider-Man quote: with great

[00:14:19] power comes great responsibility. In the

[00:14:22] digital realm, though, how does Self balance these

[00:14:24] two aspects, particularly in terms of things

[00:14:26] like user data and privacy? Because again, huge

[00:14:29] topics right now.

[00:14:30] Yeah, I think the starting point is who's in

[00:14:34] control, and if you decide, as a tech platform,

[00:14:38] that the user should be in control, then there's

[00:14:40] a rabbit hole that you go down which

[00:14:44] is pretty tricky to navigate, because the

[00:14:47] end of that Jack Parsons quote, which he made

[00:14:49] in 1946, is that both edges of that double-

[00:14:51] edged sword are exceedingly sharp, and the

[00:14:53] weapon is not suited to casual, cowardly

[00:14:55] or treacherous hands.

[00:14:56] What that really means is that if you

[00:14:58] take liberty on one side, which is

[00:15:00] nice and fun to speak of, you know, people

[00:15:02] are choosing what they do, the responsibility

[00:15:06] side of things is, well, okay, so let's imagine

[00:15:09] you create a fully socialised platform

[00:15:12] where everyone can do everything and

[00:15:13] there's no law. Then how do you handle

[00:15:16] people who want to do malevolent things?

[00:15:20] It's easy when you look at people doing

[00:15:22] benevolent things, because they are

[00:15:25] probably operating within the law

[00:15:28] anyway, but if they're doing malevolent

[00:15:30] things and they're wanting to cause harm

[00:15:32] to other people, what responsibility do

[00:15:35] you have as a platform if you've given

[00:15:37] absolute freedom to people? So those

[00:15:40] tricky dinner party questions become

[00:15:43] your strategic backbone, and you have to

[00:15:46] have answers to those things. And my

[00:15:48] answers to those things are in the

[00:15:49] commitments that I mentioned, in terms

[00:15:51] of upholding the regulations and

[00:15:53] legislation in the territories that

[00:15:54] we operate in, but also not causing harm

[00:15:57] to people and upholding human rights. So

[00:15:59] there's a balancing act here, where if,

[00:16:01] for instance, a country's government or

[00:16:04] regulatory body said, well, what we

[00:16:07] require is for us to have access to things

[00:16:10] that you've told humans are

[00:16:13] completely theirs, my answer to that would

[00:16:16] be, well, I'm totally okay with that,

[00:16:18] provided that we tell the humans that

[00:16:20] this is what we've been asked to

[00:16:21] provide, and allow the humans the

[00:16:23] right to say no. I mean, that's not

[00:16:25] absurd, right? Surely that's okay,

[00:16:28] surely it's all right to give humans

[00:16:30] the right to say what happens to their

[00:16:32] information. And that

[00:16:35] seems remarkably straightforward to me

[00:16:38] if you look at it through the eyes of

[00:16:39] this permission and preference, but

[00:16:43] it's extraordinarily difficult if what

[00:16:45] you're trying to do is pull the wool

[00:16:46] over people's eyes and go, hey, use

[00:16:48] this GPT platform, it's fun, you know,

[00:16:51] you can ask it to write you a

[00:16:52] song or a poem, and look how amazing

[00:16:55] this graphic imagery is on Midjourney,

[00:16:56] and of course, you know, we will try and

[00:16:59] cover over the fact that in the terms

[00:17:01] and conditions all of the data isn't

[00:17:02] actually yours anymore, and if you upload

[00:17:04] a photo of your kids to Facebook they

[00:17:05] actually own the rights to monetize it,

[00:17:07] we'll worry about that later, you've

[00:17:09] accepted the terms and conditions, happy

[00:17:10] days. I think that's absolutely,

[00:17:13] categorically against everything that

[00:17:17] we should be doing as a society.

[00:17:19] Couldn't agree more. And as we go further down that

[00:17:23] rabbit hole, you've opened up

[00:17:25] the AI word. We've done really well,

[00:17:27] 15 minutes on a tech podcast before

[00:17:28] we've mentioned it. But you opened

[00:17:30] the door mentioning Midjourney there,

[00:17:32] and of course, with the rapid

[00:17:33] advancement of AI, usually in that

[00:17:36] argument or any conversation around it,

[00:17:38] it's not too long until we start

[00:17:39] talking about the ethical

[00:17:41] considerations surrounding user data

[00:17:43] and privacy, because they're getting

[00:17:45] more and more complex. It is very

[00:17:47] straightforward, like you said, but

[00:17:48] some of the rules around that are just

[00:17:49] incredibly complex. So how do you

[00:17:52] at Self, with this adaptive AI platform,

[00:17:54] how do you navigate these ethically

[00:17:56] uncharted digital waters, ensuring

[00:17:58] that data is not just protected but

[00:18:00] also used in a manner that benefits the

[00:18:02] user too? I think there are two sides

[00:18:06] to that as well. One is the one that

[00:18:08] I mentioned in terms of who is in

[00:18:10] control and whose permission is given,

[00:18:12] i.e. each individual human.

[00:18:15] But the other side is our design

[00:18:18] approach, and by the design approach

[00:18:20] I don't mean graphical design, I

[00:18:21] mean design of our entire

[00:18:23] corporate structure and our strategy

[00:18:26] and the features and the way that

[00:18:30] we come to market, our offering. And

[00:18:33] the decision criteria based

[00:18:37] on the commitments I mentioned

[00:18:38] earlier are easier when you know what

[00:18:41] you stand for. And if you have a

[00:18:44] gray area of what is the right

[00:18:46] decision under these circumstances,

[00:18:49] sadly that gray area tends to

[00:18:51] skew towards commercial gain. And

[00:18:53] I'm not saying, by the way, that commercial

[00:18:54] gain is bad, I think

[00:18:56] commercial gain is fantastic, it's just

[00:18:58] that if you sacrifice things that

[00:19:01] are right for things that will

[00:19:05] make more profit, then your ultimate

[00:19:08] purpose is number go up, and

[00:19:11] then what happens is you end up in

[00:19:13] a situation where you're looking

[00:19:14] at decisions of saying, well, you know,

[00:19:16] is it right for us to quietly

[00:19:20] sell some of the information we've

[00:19:22] got to a market research agency?

[00:19:23] No one will ever know about it, and

[00:19:25] it will help the balance sheet, we'll be

[00:19:26] able to build lots of really good

[00:19:28] things for humans while upholding

[00:19:29] human rights, but you know, no one's

[00:19:32] really going to know if we kind of

[00:19:34] anonymize information and sell it

[00:19:36] to, you know, a market research

[00:19:38] agency. We can even class it as

[00:19:40] providing the world with data,

[00:19:42] and that seems, quote unquote,

[00:19:46] innocent with a small 'i'.

[00:19:48] Yeah, the truth there is that

[00:19:49] if your design principles are, well,

[00:19:52] even if information is

[00:19:54] anonymized, the user needs to decide

[00:19:56] whether or not their anonymized

[00:19:57] information is used in market

[00:19:58] research. And more so, how about if

[00:20:01] some of the money that comes from

[00:20:02] that market research was then

[00:20:04] given back to consumers at their

[00:20:05] own demand? Maybe, here's an

[00:20:07] outrageous thought, maybe

[00:20:09] they can actually set the price,

[00:20:12] so that if they want

[00:20:14] us to monetize their information,

[00:20:16] who knew, maybe they can actually

[00:20:18] say, well, we want it to happen, and

[00:20:20] this is the price we want it to

[00:20:21] happen at. And so it's not

[00:20:23] tremendously complicated, it's just

[00:20:26] the opposite of the way they're used to.
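
To make that concrete, here is a minimal sketch in Python of the consent-and-pricing idea Jonathan describes, where the user decides whether anonymized data can be used for a given purpose and sets their own price. This is an illustration only; the names (ConsentRecord, can_share) are hypothetical and not Self's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    """One user's terms for a single data use, written by the user, not the platform."""
    purpose: str          # e.g. "anonymized market research"
    allowed: bool         # the user decides whether this use happens at all
    price_per_use: float  # the user sets the price

def can_share(record: ConsentRecord, purpose: str, offered_price: float) -> bool:
    """Data is shared only if the user allowed this exact purpose
    and the buyer meets the user's own price."""
    return record.allowed and record.purpose == purpose and offered_price >= record.price_per_use

# The user, not the platform, writes this record:
consent = ConsentRecord(purpose="anonymized market research", allowed=True, price_per_use=5.00)

print(can_share(consent, "anonymized market research", offered_price=6.00))  # True: consented, price met
print(can_share(consent, "ad targeting", offered_price=100.00))              # False: never consented
```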

[00:20:29] Love that. And before

[00:20:33] you came on the podcast I was

[00:20:34] doing a little research on you

[00:20:35] and I was reading Self's six

[00:20:37] commitments and how they were

[00:20:38] testament to the dedication that

[00:20:40] you have towards human rights

[00:20:41] transparency ethical business

[00:20:43] practices and knowing what you

[00:20:45] stand for writing those mission

[00:20:46] statements and those commitments

[00:20:48] is arguably the easy part so I

[00:20:50] have to ask what kind of

[00:20:52] challenges and triumphs have you

[00:20:53] encountered on your own unique path

[00:20:56] when trying to adhere to these

[00:20:58] commitments especially when you're

[00:20:59] faced with the commercial pressures

[00:21:01] within the tech industry too

[00:21:03] because again it's quite a balancing

[00:21:04] act. Well yeah, you're right,

[00:21:07] and the words 'commercial pressures'

[00:21:09] arguably answer one of those

[00:21:11] straight off the bat. The biggest

[00:21:13] challenge of all is fundraising,

[00:21:15] because the majority of people

[00:21:17] think one of two things. One is,

[00:21:20] you're talking nonsense, it's too

[00:21:21] ideological. The other is, well, make

[00:21:24] a load of profit and then we'll

[00:21:25] believe it. And very, very, very few

[00:21:28] go, yeah, this is the right thing to

[00:21:29] do, we'll back it. And it would

[00:21:33] appear to be the most obvious, but

[00:21:35] it's certainly the biggest

[00:21:37] challenge, because that's not

[00:21:39] the way that the investment seems

[00:21:40] to work. On the other side, one of

[00:21:43] the upsides, you know, the

[00:21:44] positive I guess, is

[00:21:47] the loyalty. I mean, we're just

[00:21:49] in a test phase, we're testing with

[00:21:51] people, we've got several thousand

[00:21:53] people on a waiting list, and we

[00:21:54] took a hundred, a couple of

[00:21:56] hundred, in at a time, and

[00:21:58] people who've got Self

[00:21:59] haven't wanted to step away,

[00:22:00] and the loyalty is incredibly

[00:22:05] high. When you have your own

[00:22:08] personal AI that remembers you,

[00:22:09] and you can call it a name,

[00:22:11] like Alfred or Simon

[00:22:13] or Justine or whoever, that

[00:22:15] then becomes the name of

[00:22:17] your artificial

[00:22:19] intelligence, and it will remember

[00:22:21] you. If you say to Alexa, my

[00:22:22] favorite color is red,

[00:22:25] what car, what car

[00:22:27] out of these three should I

[00:22:28] choose, Alexa will give you

[00:22:31] a version of Google results as

[00:22:34] a reply, similar to almost

[00:22:37] any form of platform that's

[00:22:38] theoretically in that ecosystem,

[00:22:39] whereas Self actually

[00:22:41] remembers you and will help you,

[00:22:43] and can recall the information

[00:22:45] that you've sent to it.

[00:22:46] So the biggest upside is

[00:22:48] the fact that the loyalty is

[00:22:49] extraordinarily high. I mean,

[00:22:51] the Self app is actually Google

[00:22:52] without adverts, right, to some

[00:22:53] extent. But the

[00:22:56] even bigger

[00:22:58] upside is that what we're

[00:23:00] proving at the moment is that

[00:23:02] if that engine was the

[00:23:04] underpinning engine of an

[00:23:06] operating system, then it's

[00:23:08] likely that your attachment

[00:23:10] to the operating system,

[00:23:12] which is actually your own operating system

[00:23:13] if you like, is very

[00:23:17] likely to have the escape

[00:23:19] velocity from the existing

[00:23:20] OSes like Apple and Google

[00:23:22] and Microsoft. And

[00:23:24] it's a bold thing to say, but

[00:23:26] where the escape velocity will

[00:23:28] come is not when we raise

[00:23:29] 500 million or 2 billion; the

[00:23:31] escape velocity will come when,

[00:23:34] out of 100 people who use

[00:23:36] the systems that are actually

[00:23:38] their own systems, the loyalty

[00:23:40] is in the high double-

[00:23:42] digit percentage. And when you

[00:23:44] get that level of loyalty,

[00:23:45] and people understanding the

[00:23:46] power that they have, literally,

[00:23:48] within themselves and their

[00:23:50] self, then the traction

[00:23:54] likelihood is far more

[00:23:56] efficient than trying to

[00:23:57] convince people of yet

[00:23:59] another Twitter-esque social

[00:24:00] network, but it's not Twitter,

[00:24:02] or yet another Facebook, but

[00:24:04] it's not Facebook. That isn't

[00:24:05] going to do it, and funky

[00:24:07] graphics ain't going to cut it.

[00:24:08] Giving people the power back,

[00:24:11] historically, and I mean that

[00:24:12] with no uncertainty,

[00:24:15] historically, when people are

[00:24:17] given the power and can

[00:24:19] collaborate together en masse,

[00:24:21] stuff gets overturned.

[00:24:22] And whilst I have breath in my

[00:24:24] body, that will be the mission,

[00:24:26] and even if I only get to the

[00:24:27] stage of leaving tools around

[00:24:29] for other people to build on,

[00:24:30] and that's as much as I can

[00:24:31] achieve, that's enough.

[00:24:34] If we can do more than that,

[00:24:35] that's a bonus.
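
Here is a minimal sketch in Python of the remember-and-recall behaviour Jonathan contrasts with Alexa: the user's facts stay in a local store the assistant consults when answering. It assumes nothing about Self's actual architecture; every name in it (PersonalAI, remember, choose) is a hypothetical illustration.

```python
class PersonalAI:
    """A toy personal assistant that keeps the user's facts locally
    and uses them when answering, rather than returning generic results."""

    def __init__(self, name: str):
        self.name = name                   # the user names their own AI, e.g. "Alfred"
        self.memory: dict[str, str] = {}   # facts stay with the user

    def remember(self, key: str, value: str) -> None:
        self.memory[key] = value

    def choose(self, options: list[str]) -> str:
        # Recall a stored preference and apply it; fall back to the first option.
        color = self.memory.get("favorite color")
        for option in options:
            if color and color in option.lower():
                return f"{self.name}: the {option}, since your favorite color is {color}."
        return f"{self.name}: I don't know your preferences yet; maybe the {options[0]}."

alfred = PersonalAI("Alfred")
alfred.remember("favorite color", "red")
print(alfred.choose(["blue hatchback", "red coupe", "grey SUV"]))
# -> Alfred: the red coupe, since your favorite color is red.
```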

[00:24:38] Wow. Just listening to you,

[00:24:39] it's incredibly inspiring,

[00:24:41] and I'm reminded really of

[00:24:42] the original vision of the

[00:24:44] World Wide Web by Tim Berners-

[00:24:46] Lee; it was one of decentralisation

[00:24:49] and equal access, and that vision,

[00:24:51] as you've said, has been

[00:24:52] compromised over time. So I'm

[00:24:54] curious, what role do you see

[00:24:55] Self playing in helping really

[00:24:57] realign the web with those

[00:24:58] foundational principles? Because

[00:25:00] it feels like we've lost our

[00:25:01] way over the last few decades.

[00:25:03] Yeah, I think Tim's

[00:25:07] view was twofold. One was

[00:25:10] indeed a decentralised

[00:25:13] system for the freedom of

[00:25:16] information, but let's

[00:25:19] not forget that the truth

[00:25:22] of it was to meet demand for

[00:25:24] automating information sharing

[00:25:25] between scientists and

[00:25:27] universities and institutes

[00:25:28] around the world.

[00:25:29] So yes, there was an

[00:25:31] ideological view, and there was

[00:25:34] a practical requirement.

[00:25:36] I think that it would

[00:25:39] be remiss of me not to

[00:25:42] try and manifest what it

[00:25:45] would have looked like if the

[00:25:47] World Wide Web was based

[00:25:50] on our individual sovereignty,

[00:25:52] and, as I say these words

[00:25:56] out loud with a smile, the

[00:25:59] challenge is not one of

[00:26:01] ideology. I don't find it hard

[00:26:04] to make the case for it from a

[00:26:06] human perspective, but the sheer

[00:26:09] adverse view of big tech,

[00:26:14] the trillion-dollar

[00:26:15] market-cap companies, to

[00:26:17] some extent, is so

[00:26:20] opposite to this

[00:26:23] that it almost becomes

[00:26:25] a marketing vehicle

[00:26:27] in and of itself.

[00:26:28] And what I mean by that, Neil, is

[00:26:30] it shouldn't be

[00:26:34] any form of inspiration or

[00:26:36] thought leadership to say,

[00:26:38] hey guys, how about we build

[00:26:40] things for the people that

[00:26:41] are actually under the control

[00:26:43] of the people.

[00:26:45] It shouldn't be

[00:26:47] anything other

[00:26:48] than, well, yeah,

[00:26:49] duh. But in fact it's like,

[00:26:52] hey guys, we're going to build

[00:26:53] something that humans will

[00:26:55] pretty much hate in terms of

[00:26:57] user experience, but we're going

[00:26:58] to suck them into it because

[00:26:59] basically they're addicted to

[00:27:01] it, and we're going to cause

[00:27:02] greater depression and suicide

[00:27:04] rates, but we're going to make

[00:27:05] zillions. And everyone goes,

[00:27:07] yep, here's a check for a

[00:27:09] billion dollars. And so

[00:27:11] it's a strange old world.

[00:27:12] That should be

[00:27:14] the kind of eye-opening,

[00:27:15] eyebrow-raising thing that

[00:27:17] people go, well, hold on a

[00:27:18] minute.

[00:27:19] That's not the way you want

[00:27:20] to run a society.

[00:27:21] That's not the future that we

[00:27:22] want. But it's the opposite

[00:27:24] of that. It's like, you come

[00:27:25] up with something that makes

[00:27:26] complete basic human sense

[00:27:28] and everyone says you're

[00:27:29] nuts, and then you build

[00:27:31] something that goes against

[00:27:32] human rights and

[00:27:33] actually screws people's

[00:27:35] lives up, and you're a

[00:27:37] billionaire straight away.

[00:27:39] It really is, and it feels

[00:27:41] like we've almost convinced

[00:27:42] ourselves that the magnificent

[00:27:43] seven in big tech, who are

[00:27:45] also currently propping up the

[00:27:46] stock market right now, that

[00:27:48] they make billions from

[00:27:49] every click and swipe that we

[00:27:50] make, and hey, that's just the

[00:27:52] way it is. And you've

[00:27:53] mentioned the potential

[00:27:54] for technology structures to

[00:27:56] enable rather than restrict

[00:27:58] freedom.

[00:27:58] I absolutely love that

[00:28:00] line.

[00:28:00] So can you provide any examples

[00:28:02] of how Self is crafting

[00:28:04] such structures, particularly

[00:28:05] in enabling users to have

[00:28:06] control and ultimately

[00:28:08] ownership over their data?

[00:28:10] Yeah, I think the starting

[00:28:12] point is giving

[00:28:13] people the right to access

[00:28:15] what's recorded and where

[00:28:16] things sit in the database.

[00:28:18] I mean, in Self, when you

[00:28:19] create an account,

[00:28:21] there's a preference screen,

[00:28:23] which is what your AI has

[00:28:24] remembered, and you can then

[00:28:26] change it.

[00:28:27] Now, can you imagine that

[00:28:28] with any of the AI

[00:28:29] platforms you've tried?

[00:28:31] Can you imagine going into

[00:28:32] ChatGPT or any of the others

[00:28:34] and being able to see what's

[00:28:35] recorded in the background?

[00:28:36] Imagine?

[00:28:37] Yeah, I should say.

[00:28:38] No.

[00:28:39] Yeah, you're not allowed

[00:28:41] to see any of that.

[00:28:42] That's not for you.

[00:28:43] But so it starts

[00:28:46] there, you know. Freedom

[00:28:47] to look at what's recorded

[00:28:48] in your database is

[00:28:50] pretty basic,

[00:28:53] but moving forward,

[00:28:54] I think what's going to be

[00:28:56] the most fun is where people

[00:28:57] can see their own neural net,

[00:29:00] their digital twin neural net,

[00:29:01] to actually look at

[00:29:02] the Self brain and see how

[00:29:04] they use it and how

[00:29:07] their brain responds to them.

[00:29:10] And if you imagine that in

[00:29:11] the context of

[00:29:13] what we call Entirety,

[00:29:15] and it's called Entirety

[00:29:16] because freedom is everything,

[00:29:18] in the Entirety OS,

[00:29:20] being able to visualize,

[00:29:22] in a very simple, non-techie

[00:29:24] way, how things are linked together,

[00:29:27] how your apps are linked together

[00:29:28] and how they operate,

[00:29:30] will mean that we get to a stage

[00:29:32] where you don't need to open 16

[00:29:34] browser tabs

[00:29:35] to organize something, or to download

[00:29:37] five different apps that

[00:29:38] evidently have no interest

[00:29:40] in speaking to each other.

[00:29:42] In Entirety OS,

[00:29:43] the developer SDK

[00:29:45] will have Self built in,

[00:29:47] and all of your apps

[00:29:48] are automatically linked

[00:29:50] together and also all serving you.

[00:29:52] So that's what I mean

[00:29:53] by freedom in terms of

[00:29:56] freedom of not only data sovereignty

[00:29:57] but freedom of choice and freedom

[00:29:59] of information, that is,

[00:30:01] freedom of visualizing your information

[00:30:03] and freedom of how things are mapped together.

[00:30:05] And what we have to do is make it

[00:30:07] as easy as humanly possible.

[00:30:09] And that's why even the V1

[00:30:12] of Self

[00:30:13] shows people,

[00:30:15] in a really basic,

[00:30:17] zoomed-in way:

[00:30:18] this is what Self's recalled.

[00:30:20] And it's only

[00:30:21] the first step.

[00:30:22] Hey, you know, we've been

[00:30:23] building for ages, but

[00:30:25] we're now able

[00:30:26] to offer these things.

[00:30:28] And human rights start with our freedom,

[00:30:30] I think a freedom of our self,

[00:30:32] and where our information is

[00:30:34] and how it's used.

[00:30:35] Why not be in our own control?

[00:30:36] That's what I think.
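
As a rough sketch of that preference screen idea, here is a toy Python memory store the user can fully list, correct, and erase. Again, the names (UserOwnedMemory, show_all, forget) are hypothetical illustrations, not Self's real interface.

```python
class UserOwnedMemory:
    """A memory store where the user can always see, correct,
    and erase what their AI has recorded about them."""

    def __init__(self):
        self._facts: dict[str, str] = {}

    def record(self, key: str, value: str) -> None:
        self._facts[key] = value

    def show_all(self) -> dict[str, str]:
        # The 'preference screen': nothing recorded is hidden from the user.
        return dict(self._facts)

    def correct(self, key: str, value: str) -> None:
        self._facts[key] = value

    def forget(self, key: str) -> None:
        self._facts.pop(key, None)

memory = UserOwnedMemory()
memory.record("favorite color", "red")
memory.record("home city", "London")
print(memory.show_all())          # the user sees exactly what is remembered
memory.correct("favorite color", "green")
memory.forget("home city")        # and can change or erase any of it
print(memory.show_all())          # {'favorite color': 'green'}
```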

[00:30:38] Well, Jonathan, it's a perfect moment

[00:30:40] to end on, but before we do:

[00:30:41] we started the podcast

[00:30:43] talking about your origin story,

[00:30:44] the path that you've been on,

[00:30:46] and in a career where

[00:30:48] you've got extensive experience

[00:30:49] advising on change, digital

[00:30:51] transformation, innovation,

[00:30:52] and as someone that's

[00:30:54] seen the world of tech

[00:30:55] evolve from those early days

[00:30:57] before the World Wide Web

[00:30:58] to what it's become now,

[00:31:00] we're now looking towards the future,

[00:31:01] and things are moving faster than ever.

[00:31:03] What emerging trends

[00:31:05] do you think will significantly

[00:31:06] impact the ethical

[00:31:08] technology landscape in the near

[00:31:10] future, and how are you,

[00:31:11] with Self, preparing to navigate

[00:31:13] some of these changes that you're

[00:31:14] seeing?

[00:31:15] Sure, I think

[00:31:16] I can give a very, very quick answer to

[00:31:17] that actually, because it's

[00:31:18] a simple answer,

[00:31:19] but I will leave it

[00:31:20] with the listeners to digest

[00:31:21] the granularity of it.

[00:31:23] We

[00:31:25] are moving toward an even

[00:31:26] more visible crossroads

[00:31:28] between a utopian and a dystopian

[00:31:30] outcome,

[00:31:31] and I think both will work in

[00:31:32] duality.

[00:31:33] I think we'll continue

[00:31:35] in the big tech direction

[00:31:36] of the magnificent seven

[00:31:37] and I think that offerings

[00:31:39] like what I'm working with

[00:31:41] and I hope millions more

[00:31:43] offerings will show

[00:31:45] the alternative path

[00:31:47] and that the more visible

[00:31:50] that alternative path gets

[00:31:52] the more opportunities there

[00:31:53] will be for more firms

[00:31:56] and organizations and entrepreneurs

[00:31:57] to innovate in that space.

[00:31:59] So that's what I suspect the

[00:32:00] future looks like,

[00:32:01] and how we're going to navigate

[00:32:03] that is to make that path

[00:32:04] as blatantly clear as possible,

[00:32:06] and promote that path,

[00:32:08] and celebrate the path

[00:32:09] that others are on, and try

[00:32:11] and stop, or reduce,

[00:32:13] the abuse of the terms

[00:32:14] inside that path.

[00:32:15] Like, for instance,

[00:32:17] 'we're an ethical tech company',

[00:32:18] which people use

[00:32:19] without actually having ethical

[00:32:21] or moral principles

[00:32:22] in every single decision

[00:32:23] they make. Or people say

[00:32:25] 'decentralization', like, for instance,

[00:32:26] Ethereum, which is as

[00:32:28] centralized

[00:32:29] as a bank.

[00:32:31] So the truth

[00:32:33] of the matter is, the words

[00:32:34] decentralization, ethics,

[00:32:37] even the word freedom,

[00:32:38] even the word rights,

[00:32:39] corporate social responsibility,

[00:32:42] philanthropy, corporate giving,

[00:32:45] these are all abused terms.

[00:32:46] These are said

[00:32:48] because it makes people happy,

[00:32:51] and they're not actually done

[00:32:53] to their actual full

[00:32:55] and ontological extent.

[00:32:57] And as we move forward,

[00:32:59] we will notice and be able

[00:33:00] to call out those BS terms

[00:33:03] and buzzwords and greenwashing

[00:33:05] and so on, and we'll start

[00:33:07] to see what it all truly means.

[00:33:09] And I'm really, really happy

[00:33:11] that I'm able to help

[00:33:14] navigate some of that.

[00:33:15] And I'm hoping that

[00:33:16] we're part of the waves

[00:33:18] and movement of philosophical

[00:33:20] revolution, if you like,

[00:33:21] where we show an option,

[00:33:24] an alternative to what I think is a brainwashed,

[00:33:27] dystopian future.

[00:33:29] And I'm happy to be where we are.

[00:33:32] Well, I cannot thank you enough

[00:33:33] for spending a little of your time with me today,

[00:33:35] sharing your insights

[00:33:36] with everyone listening.

[00:33:37] And you did mention

[00:33:38] the word manifest a few moments ago.

[00:33:40] So as a fellow subscriber

[00:33:42] to the law of attraction,

[00:33:43] I'm going to see if there's

[00:33:44] something that we can do for you now,

[00:33:46] because some of the biggest names

[00:33:47] in business, VC, funding and tech

[00:33:49] have either been guests

[00:33:50] or may even listen to this podcast.

[00:33:52] So is there a person

[00:33:53] you'd love to have breakfast or lunch with?

[00:33:56] And who would that person be?

[00:33:57] And why?

[00:33:58] Because he or she might just

[00:33:59] be listening to this episode today.

[00:34:01] So let's see what we can manifest together.

[00:34:03] But who would it be?

[00:34:05] I don't know their name,

[00:34:07] or their names,

[00:34:08] but whoever it is,

[00:34:12] someone may have listened to this and gone,

[00:34:16] I, or we, would love to back

[00:34:20] something that is building the alternative path.

[00:34:24] And please, please,

[00:34:28] please do get in touch, because

[00:34:30] it really is about

[00:34:33] collaborating with people

[00:34:34] who want to be part of building a future,

[00:34:37] and are not thinking, well,

[00:34:39] we want the next unicorn

[00:34:41] story, people that

[00:34:44] resonate with this manifesto.

[00:34:47] And I mean, you can go to self.app for the manifesto.

[00:34:50] You can even search on your favorite search engine

[00:34:53] the phrase 'pursuit of human freedom'

[00:34:55] and then my name, Jonathan MacDonald,

[00:34:56] and you can read even more of my back story,

[00:34:58] in whatever results you click on,

[00:35:01] in pursuit of human freedom.

[00:35:03] People who resonate with that,

[00:35:05] you'll know if it's you.

[00:35:06] If you're listening to this and go,

[00:35:08] stop the podcast, I need to make contact immediately,

[00:35:11] it's just j@self.app, and do it.

[00:35:15] And yeah, if it doesn't resonate, then

[00:35:17] I wish the other path all the best.

[00:35:20] But if it does, then, well, Neil,

[00:35:22] if you and I have managed to manifest that,

[00:35:25] then it will be worth its weight in gold

[00:35:27] and much appreciated.

[00:35:28] You'll be doing the right thing for humans.

[00:35:31] I think that's a beautiful moment to end on.

[00:35:33] So I will throw that out into the universe,

[00:35:35] into the ether. Let's see what we can manifest there.

[00:35:37] I'll also add links to everything you mentioned there

[00:35:40] just so people can find out anything

[00:35:42] that they need more easily.

[00:35:44] And there's so many big things on your site there

[00:35:46] from those six commitments, make decisions

[00:35:48] that are in accordance with human rights

[00:35:51] to not cause or enable harm while upholding those rights

[00:35:54] enable people to have control

[00:35:56] and ownership of their own data and decide how it's used.

[00:35:59] It is so straightforward, as you said,

[00:36:02] but it seems that we have lost our way

[00:36:03] and gone completely in the wrong direction.

[00:36:06] But we are approaching those crossroads.

[00:36:08] So for anyone listening,

[00:36:09] I do urge that they check you out

[00:36:11] and maybe be a part of that movement too,

[00:36:13] but more than anything,

[00:36:14] just thank you for sharing your story today.

[00:36:16] Thanks Neil, thanks for having me, mate.

[00:36:18] Reflecting on my conversation with Jonathan,

[00:36:21] I think it's clear that the intersection

[00:36:22] of technology and ethics,

[00:36:25] presents both a formidable challenge

[00:36:27] and also a beacon of hope for the future.

[00:36:31] And Jonathan's commitment to reversing the power

[00:36:33] dynamics of big tech

[00:36:35] and championing user privacy

[00:36:37] and control through self

[00:36:39] is a testament to the possibility

[00:36:41] of a utopian tech future.

[00:36:43] I've said many times on this podcast,

[00:36:45] how many tech or science fiction films

[00:36:47] have you seen that show a positive future?

[00:36:51] If you can think of one,

[00:36:52] send me an email now, techblogger@outlook.com.

[00:36:55] I'd love for you to share them

[00:36:56] because most films always show the dystopian side

[00:37:00] and we need more optimism.

[00:37:02] So amid the hurdles of fundraising

[00:37:04] and the pervasive pursuit of profit over principles,

[00:37:07] Jonathan's dedication to an ethical alternative path,

[00:37:12] for me, illuminates the vital role

[00:37:14] of human rights in shaping technology.

[00:37:17] And as we all ponder the crossroads

[00:37:19] between dystopian and utopian tech futures,

[00:37:23] Jonathan's insights compel me

[00:37:25] to consider our role in this narrative.

[00:37:28] So how are we going to contribute

[00:37:30] to fostering an ethical technological landscape?

[00:37:33] What steps can we take to ensure

[00:37:35] that technology amplifies

[00:37:37] rather than undermines our freedoms?

[00:37:40] Over to you, email me, techblogger@outlook.com,

[00:37:43] Twitter, LinkedIn, Instagram, just at Neil C Hughes.

[00:37:47] I'd love to hear your thoughts on this one.

[00:37:49] But as we close this chapter of our discussion,

[00:37:51] let's carry forward the conversation

[00:37:53] on ethical technology and its profound impact on our lives.

[00:37:57] What actions will you take

[00:37:59] to support a future where technology

[00:38:02] upholds our dignity and our rights?

[00:38:05] So much to think about,

[00:38:06] and you've got a few hours to think about that

[00:38:08] and let me know your thoughts.

[00:38:10] But other than that, I'll return again tomorrow

[00:38:11] with another topic to keep you thinking

[00:38:14] and keep that curiosity alive.

[00:38:16] But thank you for listening today

[00:38:18] and until next time, don't be a stranger.