Facebook has never been under more scrutiny, especially after the Christchurch atrocity. Speaking at the Green Building Council of Australia’s Transform event this week, a former chief with the global giant stepped through the unintended consequences of the company’s rapid growth and the subsequent lessons for sustainable digital transformation.
Stephen Scheeler came to Australia in 2013 to run Facebook’s Australia and New Zealand business when the company was first listed on the stock market.
He now goes by the title “the Digital CEO”: an advisor, mentor and speaker dedicated to driving the uptake of digital tools in Australian businesses. He says the region is falling behind, putting the economy, businesses and “the future of Australia for our children and grandchildren” at risk.
But as a former employee of Facebook, Scheeler approaches his role with caution.
“What this all boils down to is an ethical question. How in the future should we treat the individual’s privacy and their digital identity?”
Displaying a photo to prove he worked with Mark Zuckerberg, he says that the founder of Facebook is “a genius, not perfect, but smarter than pretty much anyone you’ve ever met. And other geniuses have said that… they’ve had that opinion.”
“I’m also of the opinion that he wants to do social good and change the world for the better. The things that are happening at the moment mean I think Mark [Zuckerberg] is not sleeping well and that he and the entirety of Facebook are trying to find a solution.”
Scheeler uses the tech company’s rapid growth and its various shortcomings to provide powerful context and lessons for businesses and institutions in the process of taking up digital tools.
The platform started off with the intention to connect people – a mission as valiant as “motherhood and apple pie”, according to Scheeler.
Although he believes this ethos endures in some capacity, the company has morphed into much more than a platform for users to communicate with one another.
He says there were three key components to Facebook’s rise before the company started to “come apart at the seams”.
The first is that it’s a platform for authentic identity. Unlike Twitter, where there’s a level of anonymity if you choose it, Facebook doesn’t accommodate fake or anonymous users and seeks to flush them out.
The implication of this, says Scheeler, is that the platform knows exactly who it’s talking to and when, which is extremely appealing to advertisers.
The next step for the tech company was creating personalisation at scale. Using machine learning algorithms, the company set out to create news feeds where users see only the most “relevant” posts from others.
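As a rough illustration of what “personalisation at scale” means mechanically, here is a minimal sketch of relevance-ranked feed ordering. The features, weights and recency decay are invented for illustration only – they are not Facebook’s actual signals or algorithm:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float       # how often the viewer interacts with this author (0-1)
    predicted_engagement: float  # a model's estimate the viewer will react (0-1)
    age_hours: float             # how old the post is

def relevance_score(post: Post) -> float:
    """Toy relevance score: engagement-weighted, decaying with age.

    The weights here are illustrative assumptions; real feed-ranking
    systems combine thousands of learned signals.
    """
    recency = 1.0 / (1.0 + post.age_hours)
    return (0.6 * post.predicted_engagement + 0.4 * post.author_affinity) * recency

def rank_feed(posts: list[Post]) -> list[Post]:
    # Show the highest-scoring posts first; everything else sinks.
    return sorted(posts, key=relevance_score, reverse=True)
```

The key point is that every user sees a differently ordered feed, optimised around what the model predicts they will engage with.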
Finally, the company brought in data science to drive deliberate, data-fuelled growth based on the social imperative that people want to connect with one another.
By drawing out insights from the vast quantities of data becoming available, the company was able to drive user engagement and create the “modern Facebook we have today”.
The result, Scheeler says, is a “personal, daily newspaper for 1.3 billion humans”.
So what could go wrong? What’s a filter bubble?
First, there’s the problem of “filter bubbles”, where users are stuck consuming the same ideas and information, which Scheeler says is “great for advertisers but not good for public discourse”.
There are also the more high-profile incidents: election hacking, data breaches and data misuse, most notably the Cambridge Analytica scandal.
Why did it go so wrong?
Scheeler believes Facebook came unstuck for a number of reasons.
The company grew too fast for a start, with small teams of inexperienced developers building products and releasing them to millions of people without considering the potential risks and unintended consequences.
These teams also tend to operate on the assumption that people only use the tools for good, Scheeler added.
Essentially, he says, the problem was “too much data, not enough ethics.”
How your innocent photo can end up in an unknown narrative
On a macro level, Scheeler says digital disruption devoid of ethics can lead to the erosion of faith in institutions.
There’s also the danger that people’s “nano-bits” of data are being aggregated into “actionable data sets”.
This is essentially when benign data – like a photo of someone walking down the street – becomes knitted together with other data to be used in various unknown applications.
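The “knitting together” Scheeler describes can be sketched as a simple record linkage on shared quasi-identifiers. All of the data, field names and the matching rule below are made up for illustration; real linkage systems use far fuzzier matching across many more sources:

```python
# Two separately "benign" data sets: a street-photo log and a loyalty-card
# purchase record. Neither reveals much alone; joined on a quasi-identifier
# (time + place), they become a richer, "actionable" profile.
photo_log = [
    {"timestamp": "2019-03-20T08:05", "location": "George St", "photo_id": "img_0042"},
]
loyalty_purchases = [
    {"timestamp": "2019-03-20T08:05", "location": "George St",
     "member": "M-1234", "item": "coffee"},
]

def link_records(photos: list[dict], purchases: list[dict]) -> list[dict]:
    """Naive linkage: merge any records sharing the same time and place."""
    linked = []
    for p in photos:
        for q in purchases:
            if (p["timestamp"] == q["timestamp"]
                    and p["location"] == q["location"]):
                linked.append({**p, **q})  # the merged, identifying record
    return linked
```

The merged record now ties an anonymous photo to a named loyalty-card member – exactly the kind of aggregation neither data set’s subject anticipated.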
We need trust and privacy – and transparency on our data
Scheeler hopes that in the future, everyone and everything will have a “data identity”. This means “you’ll be able to answer questions about who has your data and what they are doing with it.”
He also says people will have sovereign control over their data, and that data ownership will be rules-based, not unlike the rules for property ownership.
And data will be monetisable, or not, depending on what people choose – “most people don’t know how their data is being monetised at the moment.”
He says among the important tech-enabled developments are protocols that allow machine learning without centralised data storage. Called “federated learning”, these protocols let organisations train shared models on data that never leaves its owner: only model updates, not the underlying personal data, are exchanged.
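The federated-learning idea can be shown with a minimal sketch of federated averaging over a toy one-parameter model. The model, learning rate and data are illustrative assumptions, not any production protocol:

```python
def local_update(w: float, local_data: list[tuple[float, float]],
                 lr: float = 0.1) -> float:
    """One participant fits a one-parameter model y = w*x on its OWN data
    and returns only the updated weight -- raw data never leaves the device."""
    for x, y in local_data:
        grad = 2 * (w * x - y) * x  # gradient of squared error (w*x - y)**2
        w -= lr * grad
    return w

def federated_average(global_w: float,
                      datasets: list[list[tuple[float, float]]],
                      rounds: int = 50) -> float:
    """Federated averaging sketch: each round, every participant trains
    locally, then only the weights are averaged centrally."""
    w = global_w
    for _ in range(rounds):
        updates = [local_update(w, d) for d in datasets]
        w = sum(updates) / len(updates)  # weights are shared, data is not
    return w
```

Here the central coordinator learns a model that fits everyone’s data, yet never sees a single raw data point – which is the privacy property the article is pointing at.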
He says the outcome will be that people will start moving to organisations that they trust and that are transparent about their activities. This is already starting to happen, he says, with Facebook in the bottom 10 of a list of 100 organisations ranked according to trust.

It was a great presentation by Stephen Scheeler, and I’m pleased that TFE posted this article.
Facebook is a very interesting case study. There is a big question around whether it is fit to continue to be a pillar of our digital world, or whether it is simply not capable of evolving fast enough. It is having a prolonged trust crisis, and this largely boils down to ethics, and in turn to its values – those that are in its DNA and lived by, rather than espoused. What are those values, and to what extent have they evolved from what was set in place by Zuckerberg when he created a tool to assess women on his uni campus? Such values permeate a corporate culture and define the ethical basis of the decisions it makes and the numerous algorithms that Facebook develops – and thus the impact it has in the world for better or worse. A new generation of technology companies may now enter the market founded on solid ethical frameworks, with a clear social purpose and relevant lived values. Might they even be able to compete with the tech titans based on their ability to build much greater trust capital with their stakeholders?
This is of course not only about the tech industry; this case study is highly relevant to the property sector, especially as the sector evolves into a digital sector in its own right. As we look at prop tech or smart cities, let’s enquire into the ethical frameworks – the values – that underpin the decisions to deploy the technology, and of the organisations that are doing it. If these are clear and appropriate, then we are starting to have a sound basis for trust. In time, the property industry’s level of ethics and responsibility with the use of technology will be tested: by the community, the media and ultimately regulation. We must be on the front foot with this, just like we were with environmental sustainability.
Read Crossing the Threshold to learn more: https://www.morphosis.com.au/crossing-the-threshold