During the COVID-19 pandemic, the use of Information Technology has enabled millions to work from home and gain some relief from social isolation while avoiding potential exposure to the virus. After the crisis has passed, however, Australian society will need to reflect carefully on its digital interactions and how best to balance them with wider social engagement. The very benefits of technology—speed, efficiency, cost-savings, immediacy of information transfer, its capacity to cater for individual preference—can come at a cost to the community if ‘online’ is made the only viable choice.

This short opinion piece was originally drafted well before the COVID-19 health crisis began. It has since been partly updated to reflect current circumstances. Essentially, I argue that while increased use of the internet has been useful in keeping society functioning during this era of social distancing, attention needs to be paid to the balance between our use of Information Technology and other means of human interaction when the present social restrictions are lifted. The article is a plea for a more generous concept of the future once the current health crisis is but a memory: we sell ourselves short if we talk about the future primarily as a digital screen.

The Juggernaut

While we cannot halt the technological juggernaut in its tracks, we CAN redirect it and choose what path it takes. To do that, however, we need information on how technology is affecting individuals and the society as a whole. It is foolhardy to make decisions that concern the future … in the absence of adequate information and research on their social effects.[1]

So wrote the author Perry Morrison in Australian Society—a journal no longer in print—nearly four decades ago. At the time of writing (1982), the office typewriter was still common, private letters regularly appeared in the mailbox, the public telephone was yet to begin its quixotic competition against mobile phones, and the market for personal computers (PCs) was growing but still relatively modest.[2] Nevertheless, the potential for electronic databases and communication technology to transform society was already clear. As Morrison informed his early 1980s audience,

Eventually systems of the view-data type will make it possible to send mail electronically, to shop without leaving home, and perhaps even to allow the first vision telephones … Without the need to leave the home, people may resort to electronic communication as their socialising medium—with whatever consequences.[3]

Morrison argued strongly that governments and society must not accept technological changes uncritically, because choices about technology ‘will define and shape the society in which we all must live.’[4] Nearly four decades on, the air of caution in Morrison’s words is rather alien to the way in which new technologies have been embraced: institutions and citizens have tended to dive straight into the deep end of the IT swimming pool, seemingly believing that each new technology-driven reform or consumer product is part of human progress as a whole.

The late US academic and commentator Neil Postman’s argument that the computer ‘has usurped powers and enforced mind-sets that a fully attentive culture might have wished to deny it’[5] is as relevant today as it was in 1992. You do not have to be a Luddite to be concerned by the potentially limiting nature of over-application of computer solutions for society’s problems.

Second-hand Living in the 21st Century

Human beings are sometimes shy creatures, suspicious of strangers and afraid of social encounters and the things that can go wrong. Technology has evolved to the extent that many of us can avoid much face-to-face interaction and thus limit the possibility of feeling vulnerable, uncertain, unprepared and judged. Furthermore, handheld IT devices containing digital games, TV channels, email and more can offer an immediate antidote to boredom wherever you are in the world. For better and for worse, we now have the power to cancel out our surroundings and immerse ourselves in artificial worlds.

In the years leading up to COVID-19, private enterprise and government organisations had been doing their bit to encourage the individual to avoid casual human interaction. Increasingly, the assumption was made that the citizen/consumer would prefer to interact with a screen rather than a human being. The decades-old phenomenon of the pub with gigantic blaring TV monitors drowning out all signs of life has been joined by the dubious pleasures of self-service using touch-screens: checking out library books, checking in for flights and buying groceries at a self-serve check-out. Further, social services provided by government, such as pensions and unemployment benefits, are now largely mediated via website transactions, with the recipient in charge of self-reporting their circumstances.

In my view, this technology-driven trend towards avoiding human contact in routine daily interactions is regrettable. Being served by a librarian, a check-out operator, a public servant or airport staff is a gift. The experience might be interesting, it might be boring, it might be pleasant or unpleasant. It is still a gift, because such transactions keep us well-versed in the fine art of getting along with people. For the lonely and introverted, it may be the only human contact they have all day. Finally, such contact between the ‘server’ and the ‘served’ provides mini-events that are useful conversation fodder and can disabuse us, temporarily, of the belief that our problems and concerns are at the centre of the universe.

No doubt, digital technology has been a useful educational, work and social tool for many during the current crisis. But once the health restrictions on freedom of movement are removed, it is worth reconsidering our digital dependence, which had been developing long before COVID-19 came on the scene.

Digitisation, Business and Government

There have been many sweeping changes in public life since the 1980s, with an increasing reliance on computers to do what used to be done by other means. Much of this evolution can be interpreted positively. But many if not most digital initiatives are driven by corporations and organisations obsessed with cost-saving, creating real or imagined efficiencies and developing the ‘right’ corporate image. If such priorities were all that mattered, that would be fine. But as Neil Postman pointed out, ‘It is important to remember what can be done without computers, and it is also important to remind ourselves of what may be lost when we do use [computers].’[6]

There is nothing wrong with using technology to make information systems faster and more focused on an organisation’s current needs. The danger occurs when organisations and citizens become seduced by the notion of ‘total transformation’, where old ways of doing things are completely discarded in favour of the uncritical acceptance of the new. Over the last decade, Australian policy-makers have increasingly pursued ‘e-government’ and the digitalisation of services and practices. At its most extreme, the rhetoric surrounding digital reform denies agency and choice beyond savvy adaptation to the new paradigm:

Digital transformation. Digital disruption. Digitalisation. These are more than buzzwords, they are changes that are affecting everyone, including us at the National Archives of Australia. The best response to these challenges is to embrace this inevitable change.[7]

Indeed, ‘digital first’ strategies are being implemented which make online transactions between citizen and government the default setting. The official focus on e-government suits the managerialist environment in which policy in western countries is conducted.[8] The managerialist ethos is business-oriented and obsessed with outcomes and targets, progress on which can be quantified through annual reports and other ‘announceable’ data.

Probably the most frequent criticism of online changes to the operation of government services like Centrelink has been the failure to acknowledge fully the ‘digital divide’ in Australia between those who have ready access to ICT facilities and those who do not. Low income earners, the elderly, and people in rural and remote communities are often placed at a disadvantage by a centralised, ‘digital first’ approach to accessing government services. Patchy mobile phone reception, slow-speed internet and digital literacy problems among some groups remain a challenge to the effectiveness of ‘e-government’ outside major urban areas.[9]

But it is not just the ‘digital divide’ that policy-makers should be concerned about. A ‘digital by default’ setting for service delivery inevitably distances public servants and politicians from the people whose interests they are supposed to serve. A neat set of computer-generated statistics is deceptively easy for governments to deal with compared with complaints made over the phone or in person about service delivery. Discouraging face-to-face communication, mailed correspondence and phone calls from the public in favour of ‘clients’ making their own way through digital platforms may save money, but it is likely to increase distrust of government in the long run if citizens do not have clear choices in how they communicate and interact with public institutions.

Political endorsement of the digital society is nevertheless clear. It may explain why there was little government concern (just prior to the present health crisis) about the plan of one major Australian supermarket to remove the need for check-out operators altogether in favour of a new system involving artificial intelligence:

‘I have no doubt in the next 10 years customers will be able to take the product off the shelf, put it in their basket, walk out and have it all paid for,’ [Coles executive] Mr Davis said.[10]    

Such official silence about an automation plan that would end a popular pathway for young people into work clashes sharply with the bipartisan political emphasis on employment as a major government concern, along with the official value placed on Australia’s community spirit, especially in times of crisis. If I were to predict the future, I would bet my money on governments eventually making big investments in efforts to rebuild the community connections and links in the ‘real world’ that have been lost to the ‘efficiency’ of digitalising business and government.

Reflections

A society is an ‘organised and interdependent community’.[11] For our society to function well and avoid fracture, we need to interact with other people in small and larger group settings so that we truly understand and appreciate that ‘we are all in this together’. While social distancing is necessary during the pandemic, when the crisis is over, re-establishing multiple ‘real world’ social networks will be vital for our collective recovery. Online communities have their place, but they are no substitute for the spontaneity and challenges of offline social interaction. We need to guard against a kind of technology-influenced individualism which does not encourage personal growth so much as allow users to feel stimulated but comfortably the same. As historian Steven M. Gillon reported on the rise of the Internet:

A few critics pointed out that Internet chat rooms and customized news-groups encouraged people to limit their exposure to like-minded people. Software, which allowed people to customize the information they received, resulted in what one journalist called ‘The Daily Me’—a personalized view of the world filtered to allow exposure only to individuals with similar interests and ideas.[12]

There is no way that western society would collectively contemplate turning the clock back to before the age of the PC and the handheld device. But once our ‘war’ against this terrible virus reaches an endpoint, we should think seriously about the way technology is being introduced into our lives. Policy-makers and other influential people will need to think more pluralistically when it comes to ‘future-oriented’ reform.

The future cannot be just digital. For it is more important for children to appreciate the natural and social world around them than to master ‘computer coding’. It is more important for an adolescent to be able to relate to others sensitively face-to-face than to hide behind a screen. It is especially important for adults to have an appreciation of a rich past which informs the future.


[1] Perry Morrison, ‘The Overload Ahead’, Australian Society, 3 December 1982, p. 11.

[2] See Steven M. Gillon, The American Paradox: A History of the United States Since 1945, Wadsworth, Boston, 2013 [first published 2007], p. 378.

[3] Morrison, op. cit., p. 12.

[4] Ibid., p. 12.

[5] Neil Postman, Technopoly: The Surrender of Culture to Technology, Vintage Books, New York, 1993 [first published 1992], p. 107.

[6] Postman, op. cit., p. 120.

[7] National Archives of Australia, ‘More than Buzzwords’, NAA Magazine, Issue 4, 2019, p. 43.

[8] See Margaret Sims, ‘Neoliberalism and New Public Management in an Australian University: The Invisibility of Our Take-over’, Australian Universities Review, Vol. 61, No. 1, 2019, p. 22.

[9] See for example Siobhan O’Sullivan & Christopher Walker, ‘From the Interpersonal to the Internet: Social Service Digitisation and the Implications for Vulnerable Individuals and Communities’, Australian Journal of Political Science, Vol. 53, No. 4, 2018, pp. 490-507.

[10] Dominic Powell, ‘Check-out Free Push for Coles’ Supermarkets’, Sydney Morning Herald, 13 January 2020, p. 22.

[11] Bruce Moore, Australian Pocket Oxford Dictionary (Sixth Edition), Oxford University Press, South Melbourne, 2007, p. 1013.

[12] Gillon, op. cit., p. 380.

Lyndon Megarrity

Dr Lyndon Megarrity completed his PhD at the University of New England (Armidale); it was awarded in 2002. In recent years, Lyndon has been a lecturer and tutor, teaching history and political science subjects. He was the inaugural history lecturer at the Springfield Campus of the University of Southern Queensland (2012-13) and has since taught at James Cook University in Townsville, where he is currently an adjunct lecturer.