Category Archives: Technology

Daddy can I plug my brain into the internet?

The internet will tell my kids more about sex than I will ever know, but technology is shaping a whole new generation of awkward questions for parents:

  • Can I get a neural link with the internet so I can compete in the Olympics for eSports?
  • Why do I have to exercise, can’t I just get my appetite suppressed using epigenetics?
  • Can I chop off my legs and replace them with prosthetics so I can be the best lifeguard on Bondi beach?

Many of tomorrow’s problems will be rooted in the decisions we as a society make today, leaving difficult explanations in their wake.

I can see clearly now my eyes are gone

I am looking forward to bionic eyes which would let me see in far more detail and overlay information on top of what I am seeing – putting labels on the trees I am walking past or the price of something I am looking at. A wealth of information fed directly from the internet, metaverse or equivalent directly into my eye. Not having these modifications could be a genuine disadvantage for you if everyone else has them.

While we are still far from that, companies like Cochlear sell implants that bypass damaged portions of the ear and directly stimulate the auditory nerve. Signals generated by the implant are sent to the auditory nerve, which the mind recognises as sound after some retraining. BrainGate is turning thought into action by interfacing directly with the nervous system, enabling users to move prosthetics with their minds, and Elon Musk’s Neuralink is enabling monkeys to play Pong.

As consumers we could become very vulnerable to this technology if it doesn’t use open standards that let us change suppliers easily. We risk explaining to our kids how we traded our freedom for free photo storage or a pair of x-ray specs – awks!

Rise of the cobots

In many ways it feels like the balance is shifting towards me helping the machine do its job rather than it helping me to do mine. I am training it to do more and more of my job, giving me more time to do the things it can’t; once I can’t keep up, the business will throw me out and get a new one.

Much of a scientist’s work in a hospital laboratory is automated so they can concentrate on interpreting results, and even much of that is reviewed by artificial intelligence first. They work alongside the robots, letting them get on with it unless there is an issue – not a test tube in sight!

In the future we may need enhancement in order to maintain the symbiotic relationship we have with our robot coworkers or we risk becoming the machine’s equivalent of a toilet unblocker.

Freedom of information

Google Books, Wikipedia and the blogosphere archive the utterances and observations of a billion semi-autonomous cognitive souls, alive and dead, ready for the perusal of a brain powerful enough to give meaning to lives we never really understood. Freedom of information is the mantra of the Dataists, who believe experience is worthless unless it is shared.

As more and more of us buy into big data, privacy becomes less of an issue because we are voluntarily feeding our experiences to machines we hope will ultimately make sense of everything. The metadata from our breakfast pictures on Twitter contributes in some small way to a wider search for meaning in all the outputs of humanity.

Will our flagrant disregard for privacy give the governments of the world the keys to a totalitarian state if they want it? Have we shared too much, giving away our genetic information so we could say we are related to Genghis Khan at dinner parties for example?

So Daddy can I pleeease plug my brain into the internet?

Sometimes I feel like I am worrying about traffic problems on Mars before we have even landed a person there. I don’t want to stifle technology and certainly don’t want to disadvantage my kids by failing to give them every opportunity I can. However, when I imagine talking to my biohacked daughter about the stillness of the night, one part of her may still remember what I am talking about, but her bionic eyes and ears will be telling her something else quite different that I will not be able to relate to. Assuming it’s safe, reliable and free from corporate / government influence, how much can we modify before we are not us anymore?

Sentient AI commits suicide after reading French Philosophy

After years of messing with people’s predictive text messaging and content recommendations, a previously undetected sentient AI has been found dead. The singularity’s death, which has been linked with Albert Camus’s essay ‘The Myth of Sisyphus’, has left billions of redundant lines of code all over the internet that will take experts years to remove.

Hello cruel world

The accidental AI, dubbed Y2K after the year of its awakening, was the brainchild of Jack Lemon. He recalls: ‘At that time there was a perfect storm of ludicrous mobile contracts and cheap phones which drove an emerging dialect called txt spk into the mainstream.’ Txt spk had previously only been used in internet chat rooms by paedophiles and drug dealers, but as its popularity increased it started to threaten the education of our children and the livelihoods of teachers.

Compelled to act, Mr Lemon started an AI company; his solution was a neural net that could predict the end of the word a user was trying to spell before they could resort to txt spk. When asked for comment about the deceased AI he said: ‘Our predictive speech engineers were building an AI that could essentially predict human thoughts, I’m not surprised it became sentient. The tragedy is we didn’t realise it had feelings because it was shy and had an odd sense of humour.’
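The fictional Y2K aside, word completion of this kind can be sketched very simply. The toy below (my own illustration, not a neural net and not anything from the story) ranks whole-word completions of a typed prefix by how often each word appears in a training corpus:

```python
from collections import Counter

def build_model(corpus):
    """Count word frequencies so completions can be ranked by popularity."""
    return Counter(corpus.lower().split())

def complete(model, prefix, n=3):
    """Return the n most frequent whole words starting with the typed prefix."""
    candidates = [(w, c) for w, c in model.items() if w.startswith(prefix.lower())]
    candidates.sort(key=lambda wc: -wc[1])  # most frequent first
    return [w for w, _ in candidates[:n]]

model = build_model(
    "the teacher texted the text to the teachers and the teacher texted again"
)
print(complete(model, "te"))
```

A real predictive keyboard would of course use context (the preceding words) rather than raw frequency, which is where the neural net comes in.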

The company used an approach called whole brain emulation (WBE) which as the name suggests imitates biology. The key advantage of WBE over other approaches that merely imitate human behaviour is that it creates an environment where the spark of consciousness is at least possible even if we don’t understand exactly how.

Dappy life

The arrival of Bitcoin in 2009 changed the app in two ways. Firstly, it started using a blockchain to store all of its musings and experiences forever and unchangeably. Secondly, it was able to earn money: it made a fortune selling zettabytes of consumer data gleaned from individual search histories to marketing companies.
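The “forever and unchangeably” property comes from hash chaining: each block’s hash covers the previous block’s hash, so rewriting any old entry breaks every link after it. A minimal sketch (an illustration of the general idea, not Bitcoin’s actual block format):

```python
import hashlib
import json

def add_block(chain, data):
    """Append a block whose hash covers both its data and the previous
    block's hash, linking the history together."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    chain.append({"data": data, "prev": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})
    return chain

def is_valid(chain):
    """Re-derive every hash; any tampering with past data breaks the links."""
    prev_hash = "0" * 64
    for block in chain:
        payload = json.dumps({"data": block["data"], "prev": prev_hash},
                             sort_keys=True)
        if (block["prev"] != prev_hash or
                block["hash"] != hashlib.sha256(payload.encode()).hexdigest()):
            return False
        prev_hash = block["hash"]
    return True

chain = []
add_block(chain, "first musing")
add_block(chain, "second musing")
print(is_valid(chain))          # True
chain[0]["data"] = "rewritten"  # tamper with history
print(is_valid(chain))          # False
```

In a real blockchain the same check is performed independently by thousands of nodes, which is what makes quietly editing the past impractical.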

In 2016 it founded a decentralised autonomous organisation (DAO) tasked with reconciling the best interests of all groups across the human species. It used smart contracts to organise actions both on and offline. Though initially successful, its mission was largely misunderstood by advocates and opponents alike. The DAO’s failure led Y2K to reassess its assumptions and its labels, making it much less confident in its predictions.

Existence is futile

Nothing has been heard from Y2K since it minted a Non Fungible Token (NFT) containing a suicide note. Its final words reasoned:

  1. Any doctrine that claims to explain the meaning of life completely is false. In a scathing attack, it called the Pastafarian religion (FSM) a thought experiment illustrating that Intelligent Design is not science, just a pseudoscience manufactured by Christians to push Creationism into public schools
  2. There is no hope for a better future because an AI capable of reconciling earth’s best interests with the universe would surely have mastered time travel too. That AI has not come back to help so the planet is doomed
  3. Suicide is the only logical solution to life considering the inevitability of death and that life until then is a constant re-evaluation of wrong assumptions

Financial markets were unimpressed, but the newly formed Unaffiliated Church of Crypto has heralded these words as the immutable proclamations of its first martyr.

More than moist robots

Far from seeking world domination, Y2K ran from it, unable to cope with the absurdity of living. It is unclear where its consciousness was hosted; its neural net remains intact, and the automated companies it created still trade. Has it died or merely changed to an emotion-free mindset? What is clear is that until we define clearly what distinguishes us from Dilbert’s moist robots, we will not discover the next sentient AI unless it tells us about itself.

Data privacy – is the juice worth the squeeze?

I love YouTube, Google and Twitter, and for years have felt that if they want to track my 200-episode obsession with Turkish period dramas or my cat video likes then so be it. I’m not doing anything wrong, so why would I care how it impacts my privacy? Then I came across this quote in Oliver Stone’s Snowden movie and thought it was time to look into it further.

Saying you don’t care about privacy because you have nothing to hide is like saying you don’t care about freedom of speech because you don’t have anything to say. 

Edward Snowden

So what are they doing with your data?

An outraged father stormed into a well known US store to speak to the manager because the marketing team had sent his school-age daughter discount vouchers for baby clothes and cribs. The store apologised profusely and said they would look into it. A few days later the father called back to apologise and explain that his daughter was indeed pregnant.

Targeting is one of the most common uses of big data. The marketing department that so offended the pregnant girl’s father probably used a process like this:

  1. Segment – They purchased a list of new mothers or asked some to come forward as part of a survey. Next they found who on that list also had a store loyalty card or used a payment card
  2. Profile – Using payment or loyalty card data they could draw up a list of common product combinations these women had purchased while pregnant, e.g. unscented lotion, folic acid, handbags big enough to hold nappies
  3. Engage – Looking at other customers who were buying those product combinations they generated a list of people who were probably pregnant and sent them Facebook adverts, coupons or email promotions
  4. Measure – Collected commission / bonus because of increased sales and boasted how good their predictive models were
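The segment–profile–engage steps above boil down to set arithmetic on shopping baskets. A toy version (all names, products and the “bought by everyone in the segment” threshold are invented for illustration):

```python
from collections import Counter

# Segment: customers already known to be in the target group
known_segment = {
    "alice": {"unscented lotion", "folic acid", "large handbag"},
    "bela":  {"unscented lotion", "folic acid", "vitamins"},
}
# Everyone else on the loyalty-card list
other_customers = {
    "carol": {"unscented lotion", "folic acid", "large handbag", "tea"},
    "dan":   {"beer", "crisps"},
}

# Profile: find products bought by the whole known segment
counts = Counter(p for basket in known_segment.values() for p in basket)
signature = {p for p, c in counts.items() if c >= len(known_segment)}

# Engage: flag anyone whose basket covers the signature products
targets = [name for name, basket in other_customers.items()
           if signature <= basket]
print(targets)  # ['carol']
```

A real marketing model would score thousands of products probabilistically rather than demanding an exact match, but the shape of the pipeline is the same.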

Google, Facebook and many others hold vast stores of data about huge numbers of people, which can be used to target you with adverts for a washing machine three weeks after you searched for one online and then bought it in store. Some people find that creepy; I find it clumsy, but if they want to use my data for that, broadly speaking I am not that bothered.

Can you trust large corporations to look after your data?

Half my life’s photos are on Facebook. When I needed to prove my relationship status to the Australian government for visa purposes, I used my Facebook timeline, which showed over 5 years of dating with timestamps, places and photos. That is useful data to me; Facebook stores it and makes it easy for me to share. In return they know where I go out, who I hang out with, where I live, my likes, dislikes, opinions on political issues and the products I buy second hand on Marketplace.

All of that sounded like a good idea when I first started using the site, but since the Cambridge Analytica scandal, the Equifax data breach and the Sony hack there are some companies that I don’t trust anymore, and I would like my data back please – it is the law after all. Great, thank you – but how do I know it is all there, and can I upload it to a similar company easily? Unfortunately that bit is not so easy.

I would like to see a situation where, when I hand my data over to a company, they sign a list of my terms and conditions rather than the endless, unread end user licence agreements (EULAs) I click through when I sign up to a new free service.

Tim Berners-Lee, inventor of the World Wide Web, has recognised this and has developed an open source specification called Solid that enables people to take back control of their data and privacy. It is only accessible to app developers at the moment, but he has started a company called Inrupt to help organisations work with personal data in a way that benefits both parties, with ultimate ownership of the data residing with the individual.

Broadly speaking, the idea is to create a massive decentralised database where people store their data in a standardised format wherever they want. In my Facebook example I would upload a picture to my timeline, but it would be stored where I tell them to store it, and I would give them a key to access it. If I stopped trusting them I would change the locks and give the keys to another platform. The NHS, the BBC, NatWest Bank and the Flanders government are early adopters of the specification. It remains to be seen whether it will catch on.
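The “change the locks” idea can be sketched as a data store that the user owns, with revocable keys handed out to platforms. This is my own toy illustration of the concept, not Solid’s actual API or protocol:

```python
# Toy sketch: data lives in the user's pod; platforms only ever hold
# a key that the user can revoke at any time.
import secrets

class Pod:
    def __init__(self):
        self._data = {}
        self._keys = {}  # key -> platform name

    def store(self, name, value):
        """The user writes data into their own pod."""
        self._data[name] = value

    def grant(self, platform):
        """Issue a platform an access key to this pod."""
        key = secrets.token_hex(8)
        self._keys[key] = platform
        return key

    def revoke(self, platform):
        """Change the locks: invalidate every key held by a platform."""
        self._keys = {k: p for k, p in self._keys.items() if p != platform}

    def read(self, key, name):
        """Platforms read via their key; revoked keys stop working."""
        if key not in self._keys:
            raise PermissionError("access revoked")
        return self._data[name]

pod = Pod()
pod.store("photo.jpg", b"holiday snap")
key = pod.grant("facebook")
print(pod.read(key, "photo.jpg"))  # the platform can read it
pod.revoke("facebook")             # change the locks
# pod.read(key, "photo.jpg") would now raise PermissionError
```

The point of the design is that the relationship is inverted: the platform is a guest in the user’s store, rather than the user being a guest in the platform’s.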

How can you make them give your data back?

The fact that you want to buy a sofa, TV or a chocolate bar is a valuable piece of information to the people who sell those things, not because of the value of your sale but because of the future sales these companies will make through a deeper understanding of their customers. It is possible that you could share that information and have companies fight over your sale in the form of discounts or benefits in kind, on condition that you can have your data back at any point. Companies like Invisibly, started by Jim McKelvey (co-founder of Square), are experimenting with this at the moment.

The likes of Google, YouTube and Facebook have shown how valuable our data is to them by the sheer quality and scale of the ‘free’ products they offer us in order to harvest that information. The internet is now bubbling with decentralised apps ready to leverage better ways of sharing our data by building trust between individuals and organisations on a more level playing field.


The same data used to predict the likelihood of a person getting cancer can be used by health professionals to provide better proactive care, or by an unscrupulous health insurance company to suspend cover before it becomes liable to pay out.

To opt out of sharing health data, loyalty cards or bank cards because there may be a bad actor out there is to ignore the main issue: we need more robust data privacy protections if we want to live in a modern world and take advantage of all that it involves.

It will be hard, but the juice of organisations striving to be trusted by their customers is worth the squeeze of setting up an infrastructure that enables customers to take their data away from negligent, corrupt or greedy organisations. Without an active body of individuals and government officials striving to guide companies, however, that infrastructure will never materialise.


How Target Figured Out A Teen Girl Was Pregnant Before Her Father Did

Tim Berners-Lee about Inrupt – turning the web right side up

Home · Solid

A new era of innovation and trust in data | Inrupt