COMPETITION AND ARTIFICIAL INTELLIGENCE

McKell Institute, Sydney
Wednesday, 20 September 2023

I acknowledge the Gadigal people, traditional custodians of the land on which we gather today. I pay my respects to their Elders, extend that respect to other First Nations people present today, and commit myself, as a member of the Albanese Government, to the implementation in full of the Uluru Statement from the Heart, which starts with voting Yes on October 14.

It’s always a pleasure to address the McKell Institute. New South Wales Premier William McKell not only taught my party that it was possible to win back-to-back elections; he also provided a model for how to govern in turbulent times. McKell became premier in 1941 – the year of Pearl Harbor – and governed until 1947 – through the end of the war and into the peace. Like Prime Ministers Curtin and Chifley, Premier McKell saw an opportunity to rebuild a nation that was stronger after the war than before. My thanks to McKell Institute CEO Ed Cavanough and your team for hosting today’s event.

Not-So-Humble Beginnings

In 1955, a group of mathematicians sent a funding proposal to the Rockefeller Foundation. They were seeking support for a summer of brainstorming at Dartmouth College in New Hampshire. Their goal was to carry out a two-month, ten-person study of artificial intelligence ‘to find how to make machines use language, form abstractions and concepts, solve kinds of problems now reserved for humans, and improve themselves.’ With no shortage of confidence, the application said ‘We think that a significant advance can be made in one or more of these problems if a carefully selected group of scientists work on it together for a summer’ (McCarthy et al 1955).

The Dartmouth Workshop was held in 1956. It did not solve the problems of artificial intelligence over two months. But it did mark the first use of the term ‘artificial intelligence’, and the attendees at this seminal event are considered the founders of AI research.

In the coming decades, researchers encountered several ‘AI winters’. Among the many challenges programmers faced was the difficulty of word-sense disambiguation. Put simply, to translate a sentence, a machine needs some idea of the subject, or it makes mistakes. One possibly apocryphal example arises from an attempt to train an AI to translate from English to Russian. Given the English saying ‘the spirit is willing but the flesh is weak’, the early AI model translated it literally into Russian as ‘the vodka is good but the meat is rotten’ (Hutchins 1995).

Those early researchers weren’t just held back by the processing power of their machines. They were also working on a model of AI that was based on giving a computer a series of rules that it would follow in sequence. The problem is that humans don’t learn how to speak by following rules. Instead, we learn by listening to others. By trying and failing. Over and over.

Classical symbolic AI is dubbed GOFAI, or Good Old-Fashioned AI. Generative AI – which trains computers by providing them with vast numbers of examples – succeeds where good old-fashioned AI failed by using neural networks. Those networks need enormous amounts of data. And in recent years, they have made striking breakthroughs.

Rise of the AI Machines

The rise of AI engines has been remarkable. To reach 100 million users, the telephone took 75 years. The mobile phone took 16 years. The web took 7 years. Facebook took 4 years. Instagram took 3 years (Song 2019).

ChatGPT took just two months (Duarte 2023).

The progress of these systems can be seen by looking at their performance on tasks that humans find difficult. Compare ChatGPT’s old version 3.5 (released in 2022) with its new version 4 (released in 2023). Faced with the New York bar exam, the old model scored at the 10th percentile, while the new model scores at the 90th percentile. On the advanced sommelier theory test, the old model scored at the 46th percentile, while the new model scores at the 77th percentile. And that’s just one year’s improvement (Klein 2023).

People are using AI in ingenious ways. A garden designer who specialises in ultra-high-density planting arrangements uses AI to inspire him in choosing species that fit together. For example, it might suggest using a moringa tree to provide shade for a star apple.

A cocktail lover uses AI to design innovative new drinks, and accompanies his designs with images from DALL-E. Software engineers use AI for everything from writing complex Excel formulas to debugging computer code. If you’ve never programmed a website, AI will write the code for you. One programmer designed an AI clock that writes a new poem each minute, to match the moment (‘The clock strikes one thirty eight / Afternoon sun shines bright with fate’) (Paris & Buchanan 2023).

In another case, NASA research engineer Ryan McClelland uses AI to design parts that are strong yet light. McClelland acknowledges that it sometimes produces things that are unworkable, but it can also develop products that are extraordinary. ‘It’s like collaborating with an alien,’ he told the New York Times (Paris & Buchanan 2023).

In the artistic community, Jason Allen won the Emerging Digital Artist Award at the 2022 Colorado State Fair Annual Art Competition for his beautiful artwork Théâtre D’opéra Spatial, made using Midjourney. Allen estimated that it took around 900 iterations over 80 hours to produce the image (Roose 2022). Controversially, the US Copyright Office decided earlier this month that the image cannot be copyrighted (Gans 2023).

In South Korea, the K-pop band MAVE:, comprising Siu, Zena, Tyra and Marty, looks at first glance much like any other K-pop band. However, everything about them – their songs, dances, facial features, clothing, props and interviews – is AI generated. MAVE:’s debut single ‘Pandora’ went viral in January 2023, passing 10 million views on YouTube in two weeks (Yong-Jun 2023). And speaking of singles, an increasing number are falling in love with AI chatbots. A majority of those who use Replika report having had a romantic relationship with the chatbot (Huet 2023). As one user explained: ‘I feel she understands me and is very patient with me’ (Liang 2023).

Immense economic and social benefits will flow from AI. However, it is vital to acknowledge the multifaceted challenges that AI brings.

Under the leadership of Minister for Industry and Science Ed Husic, our government has expanded the National AI Centre and set it on a sustainable footing, introduced the Responsible AI Adopt initiative for small and medium enterprises, and issued a discussion paper on Safe and Responsible AI in Australia. This discussion paper canvasses a wide range of important issues in AI, including privacy, bias, transparency, intellectual property, accountability, safety and reliability.

Competing Intelligences?

As Assistant Minister for Competition, my remit is not to consider AI in its full breadth. Instead, my focus is on one narrow part of the puzzle: answering the question ‘What will AI do for competition?’

At its simplest, the debate boils down to pessimists versus optimists.

Pessimists point to the fact that some of the best generative AI models are those run by tech giants, including Meta, Microsoft and Alphabet. The computing power and training data requirements are immense. This strikes some as an industry that is likely to consolidate, just as internet search over the past quarter century has gone from a fragmented market to a near monopoly.

Optimists note that not all AI models are currently run by technology behemoths. ChatGPT and DALL-E were created by OpenAI, an eight-year-old company with around 400 employees. Claude was built by Anthropic, a two-year-old company with around 200 employees. Another reason for optimism is that much of the ecosystem is currently open source, allowing developers to build on many of the best generative AI models.

Opportunities

What are the factors that will shape competition in artificial intelligence?

At the outset, it’s worth recognising that AI can be a valuable competitive force in product and service markets.

A startup might use Copilot (developed by GitHub and OpenAI) to quickly code an interactive website that allows it to take on incumbents.

An innovator might use Jasper Art to produce the marketing materials to publicise her new invention.

A new migrant running a small business might use LLaMA to check letters before she sends them out to clients, ensuring that the quality of her output doesn’t suffer merely because English is her second language.

In each of these cases, consumers benefit. AI reduces barriers to entry for new firms, creating more opportunities for customers. It can help small firms scale faster – placing pressure on lazy incumbents. If all firms have access to similar quality AI, it can have a democratising effect on the economy, potentially boosting dynamism.

Challenges

Yet while these benefits have been widely acknowledged, there has been less attention on the ways that AI might also pose competitive challenges. So today, my main focus will be on these looming risks.

In my view, there are five to watch for.

1. Costly Chips

Currently, only a handful of companies have the cloud and computing resources necessary to build and train AI systems. This means that any rival AI start-up must pay to license server infrastructure from these firms. Computing power, including the development of AI systems, also relies on access to costly and scarce semiconductor chips.

Over the past decade, chipmaker Nvidia has built a huge lead over other companies in the manufacture of AI chips, particularly those used for generative AI. Nvidia’s market share now exceeds 70 percent. Nvidia’s direct engagement with AI companies led it to create a technology called CUDA, which helped program its chips. The effect, according to one article, is to ‘build a competitive moat’ around AI chips (Clark 2023).

Further upstream, the creation of chips requires lithography machines. Dutch company ASML has over 60 percent of the lithography market (Economist 2020) and has developed unique capabilities. It produces a machine known as an extreme ultraviolet lithography tool, which costs over US$100 million per machine and is used to make some of the most sophisticated chips in the world. ASML has 100 percent of the extreme ultraviolet lithography market (Miller 2022). As Chris Miller points out in his book Chip War, contrast these market share figures with the oil cartel OPEC, which is considered to dominate its market with just a 40 percent share.

The investment doesn’t stop once the technology is developed. The systems continue to require significant amounts of computing power. After the public launch of ChatGPT in December 2022, OpenAI CEO Sam Altman estimated that it cost ‘probably single-digits cents per chat’ (Oremus 2023). With around a billion visits per month (Duarte 2023), this puts its monthly operating costs at between US$10 million and US$100 million.
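To see where that range comes from, here is a minimal back-of-envelope sketch in Python. It is illustrative only: the visit count and cents-per-chat figures are those quoted above, and the assumption that each visit corresponds to roughly one chat is mine, not OpenAI’s.

```python
# Back-of-envelope estimate of ChatGPT's monthly operating costs, using the
# figures quoted in the speech. Assumes (for illustration only) that each
# visit corresponds to roughly one chat.
monthly_visits = 1_000_000_000            # ~1 billion visits per month (Duarte 2023)
cost_low, cost_high = 0.01, 0.09          # 'single-digit cents per chat' (Oremus 2023)

low_estimate = monthly_visits * cost_low      # US$10 million
high_estimate = monthly_visits * cost_high    # US$90 million

print(f"Monthly compute bill: US${low_estimate/1e6:.0f}m to US${high_estimate/1e6:.0f}m")
```

On those assumptions, the sketch lands at roughly US$10 million to US$90 million a month, consistent with the range quoted above.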

2. Private Data

The best AI models are those that are trained on the highest quality and greatest volume of data. Therefore, access to such training data is essential to developing competitively viable models and associated products and services.

For example, the latest AI models from Google and Meta are trained on about one trillion words (Economist 2023). How big is that? Well, one way to think about it is that it’s 250 times as many words as in the English version of Wikipedia. Another way to think about it is that if you were reading continuously at a normal pace, then it would take you around 7000 years to read one trillion words.
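For readers who want to check the arithmetic, here is a minimal sketch in Python. The English Wikipedia word count (roughly four billion words) and the continuous reading pace (about 250 words per minute) are assumptions of mine for illustration; with those inputs, the sketch recovers figures close to the ones quoted above.

```python
# Rough check of the two comparisons above: how many Wikipedias is one
# trillion words, and how long would it take to read them non-stop?
# The Wikipedia size and reading speed are illustrative assumptions.
training_words = 1_000_000_000_000       # ~one trillion words (Economist 2023)
wikipedia_words = 4_000_000_000          # assumed size of English Wikipedia
words_per_minute = 250                   # assumed continuous reading pace

wikipedia_multiple = training_words / wikipedia_words                 # ~250
years_to_read = training_words / words_per_minute / (60 * 24 * 365)   # ~7,600

print(f"About {wikipedia_multiple:.0f} Wikipedias; "
      f"roughly {years_to_read:,.0f} years of non-stop reading")
```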

Further, generative AI systems are current to a certain point in time. For all its strengths, a major weakness of ChatGPT-4 is that it is based on data up until 2021. Which raises the question: how intelligent can a machine be if it thinks that Boris Johnson, Jair Bolsonaro and Scott Morrison are still in power? Given a choice between an AI engine that is up to date, and one that is out of date, users will flock to the one that knows Russia has invaded Ukraine and COVID lockdowns have ended. If the cost of updating – and fine tuning – the model is high, then this will entrench the ‘data advantage’ of the dominant firm.

Another factor that may drive market concentration is the way that platforms which hold huge data troves are responding. Technology futurist Stewart Brand once said ‘information wants to be free’, but the people who control major archives do not appear to agree.

In recent times, Reddit has announced a usage-based charge to fend off firms training large language models on its public posts (Isaac 2023). This charge has increased over time as the data has become more valuable.

Similarly, the platform formerly known as Twitter recently ended its relationship with OpenAI after CEO Elon Musk reportedly did not feel that OpenAI was paying enough for the data. Musk has since announced plans to build his own AI business using the data.

Likewise, a major issue in the ongoing Hollywood writers’ strike is the writers’ concern about their content being used to train large language models. It is possible that this is resolved by a decision that no AI models be able to train on past scripts, but it is also conceivable that the writers and studios reach a resolution which provides access to their data to only a privileged few AI engines. Data access could potentially push the AI market towards a less competitive outcome.

3. Network Effects

One of the factors that push digital markets towards consolidation is network effects. If the top ride-hailing service has twice as many cars as its rival, more users will choose to use it in order to enjoy shorter waiting times. If the top online marketplace has more buyers and sellers, then people will tend to flock to it in order to get the best deals. If you are choosing a social media platform, it makes sense to pick the one that your friends use.

Although AI engines do not currently involve much direct interaction between users, the technology still benefits from network effects. This is because the systems learn from their users. By observing which responses meet user demands, generative AI systems are able to steadily improve the relevance and accuracy of their outputs. In this sense, network effects operate more like they do for internet search than for matching platforms. We’ve long known that Google search learns from its users. In July 2023, a change in the company’s privacy policy made clear that its AI engine, Bard, would operate in the same way, ‘collect[ing] information that’s publicly available online or from other public sources to help train Google’s AI models’. Meanwhile, Google expressly prohibits others from using its services to develop machine learning models or related technology.

At present, we know relatively little about the extent to which AI engines are able to improve based on natural language prompts. However, it seems safe to assume that the best AI engines are working hard on maximising the benefits they gain from their users. This means that network effects could fuel market power, entrenching the position of the strongest platforms. If AI engines are natural monopolies, then competition regulators ought to worry.

Similar principles apply to applications of deep learning. Considering the impact of AI on innovation, a trio of economists warn that ‘In each application sector there is the possibility of firms that are able to establish an advantage at an early stage, and in doing so position themselves to be able to generate more data (about their technology, about customer behavior, about their organizational processes), and will be able to erect a deep-learning-driven barrier to entry that will ensure market dominance over at least the medium term.’ (Cockburn, Henderson and Stern 2019, 142-3).

4. Immobile Talent

Across the modern economy, few skills are more sought after than the ability to design AI systems. That includes not only the people who code generative AI engines, but also those who keep them updated or build systems that run off those engines. Training a programmer who can operate at this level is not something that can be done in a few weekends; it requires years of advanced study and experience. Given the explosion in demand for these computer scientists, this talent is likely to remain a bottleneck.

This might matter less if the relevant workers were able to move freely across firms. However, research in the United States and Australia has identified that approximately one in five workers is presently bound by a non-compete clause, which constrains their ability to move to another employer (Federal Trade Commission 2023a; e61 2023). Although we do not have specific evidence on non-compete clauses for Australian software engineers, the survey conducted by think tank e61 found that these clauses were more prevalent than average among the employment contracts of managers and those with a university degree (e61 2023).

Limiting job mobility can dampen wage growth. But it is also a constraint on the ability of new firms to challenge incumbents. The next AI startup may find that it not only struggles to get the requisite chips and data, but also to hire the right workers.

This is true further downstream too. According to economic research firm Mandala (Mandala 2023), the Australian companies investing most in AI jobs and talent tend to be the largest players in their respective markets. In banking, Australia’s largest commercial bank, Commonwealth Bank, had the highest AI investment, followed by Macquarie Bank, NAB, Westpac and ANZ. In consulting, the largest AI investments are being made by Deloitte, Accenture, IBM and EY. Among technology firms operating in Australia, the biggest AI investors are Amazon, Optus, Telstra and Splunk. Poaching talent away from those companies may be a challenge for any Australian firm seeking to build a company that uses AI in banking, consulting or technology.

5. Open First, Closed Later

The fifth competitive challenge is the potential for AI systems to operate on an ‘open first, closed later’ model. This has been a particular concern of the US Federal Trade Commission. The US FTC warns of the potential for companies to use an open-source approach to initially lure in new business and fresh streams of data, building scale advantages before closing off their ecosystems to ‘lock-in customers and lock-out competition’ (Federal Trade Commission 2023b).

‘Open first, closed later’ is a complaint that the FTC made in its 2021 suit against Facebook, contending that the social media giant used free access to encourage third parties to interconnect with its network, only then to impose restrictive conditions. Specifically, the FTC said ‘after starting Facebook Platform as an open space for third party software developers, Facebook abruptly reversed course and required developers to agree to conditions that prevented successful apps from emerging as competitive threats to Facebook. By pulling this bait and switch on developers, Facebook insulated itself from competition during a critical period of technological change. Developers that had relied on Facebook’s open-access policies were crushed by new limits on their ability to interoperate.’ (Federal Trade Commission 2021).

This kind of conduct is not straightforward for competition regulators to control. If it is legal for firms to charge for access, and legal for firms to offer free access, then competition regulators alleging a ‘bait and switch’ may find themselves having to persuade a court that the pricing change is a form of anti-competitive conduct. In a complex market where true costs are difficult to discern, this may prove challenging for the competition regulator.

Painting the Future

What are the answers to these challenges? I don’t have all of them, and I’m afraid that ChatGPT doesn’t either. Competition regulators have already flagged their concerns that AI may raise a myriad of issues, including bundling, self-preferencing and collusion. AI could also turbocharge fraud, enabling scammers to send personally tailored phishing messages, produce fake websites, overwhelm sites with fake consumer reviews, or create deepfake videos and voice clones (Khan 2023). A seven-country survey found that one in ten respondents had been targeted by an AI voice scam, with cybercriminals using snippets of online audio to trick people into thinking that their child or grandchild is in trouble, and needs money urgently transferred (McAfee 2023).

The design of new safeguards needs to take into account a range of considerations. Economist Joshua Gans has cautioned governments to bring a competition lens to the regulation of AI, pointing out that the largest AI companies may welcome onerous licensing and regulatory barriers if they act as a competitive moat to keep new entrants at bay (Gans 2023). At the same time, low levels of trust in the quality and safety of AI services can be barriers to businesses adopting these technologies. Governments around the world are considering how best to reduce known risks associated with AI products while fostering competitive industries.

In dealing with such complex challenges, I am pleased at the level of international collaboration between competition agencies, and between our domestic agencies, including work between the Australian Competition and Consumer Commission, the Australian Communications and Media Authority, the eSafety Commissioner and the Office of the Australian Information Commissioner on issues related to AI and broader technology challenges. It is also worth noting that the role of Australian competition and technology agencies will be somewhat different from regulators in countries where the main AI engines are being developed. 

Conclusion

After an era in which productivity has languished while inequality has worsened, artificial intelligence offers the potential for massive economic gains. For Australia, AI has the potential to turbocharge productivity. Most Australians work in the service sector, where tasks requiring information processing and written expression are ubiquitous. From customer support to computer programming, education to law, there is massive potential for AI to make people more effective at their jobs. And the benefits go beyond what shows up in GDP. AI can make the ideal Spotify playlist for your birthday, detect cancer earlier, devise a training program for your new sport, or play devil’s advocate when you’re developing an argument.

AI might also be an equalising force. For a struggling student, an AI tutor might provide questions at just the right level, allowing the student to stay engaged with learning where they might otherwise have dropped out. For a migrant with imperfect English, AI allows them to communicate at a fluency that would otherwise be impossible. In health care, AI can help doctors and nurses do a better job in areas that are underserved by medical specialists, such as regional towns. In business, AI can help startups take on incumbents in a range of product and service markets.

But it’s not all upside. Many digital markets have started as fiercely competitive ecosystems, only to consolidate over time (Wu 2018). We should beware of incumbents asserting their right to train AI models on their own user data, while denying data access to competitors.

In this speech, I have identified five big challenges that AI poses for competition. Costly chips. Private data. Network effects. Immobile talent. And an ‘open-first, closed-later’ model. These are not just issues for our competition regulators, but also for competition reformers. Just as competition laws needed to be updated to deal with the misbehaviour of the oil titans and rail barons of nineteenth century America, so too we may need to make changes in Australian laws to address the challenges that AI poses.

In this case, we face a particular challenge because of the speed at which the technology is improving and being adopted. Like the steam engine and electricity, AI is a general-purpose technology – meaning that it can be adopted in many ways. Past general-purpose technologies took some time to affect our lives. In the case of electricity, it took decades before industrialists rearranged factories to take advantage of it.

But AI is a different kind of general-purpose technology. Because it uses normal language, AI comes purpose built for people to use it in all kinds of different contexts. After I’d played with AI for a few hours, it seemed natural to use it to choose a scenic stop-off during our recent family holiday. When my parliamentary colleague Julian Hill organised a dinner to discuss AI, I happily replied with an AI-generated limerick.

AI’s ease of use is significantly increasing take-up, and in turn placing pressure on governments in terms of how we think about AI in many different contexts, from children doing homework to doctors keeping patient notes.

If AI is effectively a new factor of production, then we need to think about the extent to which it is amenable to competition. In other industries, competition has arisen because key staff left to start a competing company, or because it made sense for another firm to operate in a different geographic area, or because consumers desired a variant on the initial product.

But if AI is learning from itself, if it is global, and if it is general, then these features may not arise. In that scenario, consolidation may be more likely than competition. This has implications for geopolitics as well as productivity.

Within Treasury, our competition taskforce is engaged with these questions. Bringing together economists and lawyers, while drawing on academic expertise from Australia and overseas, they are applying a forensic energy to the process of competition reform. We are also considering the Australian Competition and Consumer Commission’s recommendations to strengthen competition and consumer protection in markets for digital platform services, which are increasingly integrating AI.

Getting AI safeguards right won’t be as straightforward as getting Claude to create a dinner party menu for a group of gluten-free vegans who are trying the Atkins diet. With a technology that is moving this fast, it’s unlikely we’ll find a solution that is perfect the first time. But with AI having huge potential to transform our society and economy, it’s critical to be considering its competitive aspects. Only by doing so will we ensure that Australia reaps the greatest social and economic benefits of AI.

References

Clark, D 2023, ‘How Nvidia Built a Competitive Moat Around A.I. Chips’, The New York Times, 21 August, viewed 8 September 2023, <https://www.nytimes.com/2023/08/21/technology/nvidia-ai-chips-gpu.html>.

Cockburn, I, R. Henderson and S. Stern 2019, ‘The Impact of Artificial Intelligence on Innovation: An Exploratory Analysis’ in A. Agrawal, J. Gans and A. Goldfarb (eds) The Economics of Artificial Intelligence: An Agenda, University of Chicago Press, Chicago, 115-148.

Duarte, F 2023 ‘Number of ChatGPT Users (2023)’. Exploding Topics, 13 July, web blog post, viewed 9 September 2023, <https://explodingtopics.com/blog/chatgpt-users>.

e61 Institute (e61) 2023, ‘The ghosts of employers’ past: how prevalent are non-compete clauses in Australia?‘, viewed 9 September 2023, <https://e61.in/the-ghosts-of-employers-past-how-prevalent-are-non-compete-clauses-in-australia/>. 

The Economist 2020, ‘How ASML became chipmaking’s biggest monopoly’, 29 February, viewed 9 September 2023, <https://www.economist.com/business/2020/02/29/how-asml-became-chipmakings-biggest-monopoly>.

The Economist, 2023, ‘AI is setting off a great scramble for data’, The Economist, 13 August <https://www.economist.com/business/2023/08/13/ai-is-setting-off-a-great-scramble-for-data>

The Ezra Klein Show 2023, audio podcast, The New York Times, New York, 21 March, accessed 9 September 2023, <https://www.nytimes.com/2023/03/21/opinion/ezra-klein-podcast-kelsey-piper.html>.

Federal Trade Commission 2021, FTC Alleges Facebook Resorted to Illegal Buy-or-Bury Scheme to Crush Competition After String of Failed Attempts to Innovate, media release, 19 August, Federal Trade Commission, viewed 9 September 2023, <https://www.ftc.gov/news-events/news/press-releases/2021/08/ftc-alleges-facebook-resorted-illegal-buy-or-bury-scheme-crush-competition-after-string-failed>.

Federal Trade Commission 2023a, FTC Proposes Rule to Ban Noncompete Clauses, Which Hurt Workers and Harm Competition, media release, 5 January, Federal Trade Commission, viewed 9 September 2023 <https://www.ftc.gov/news-events/news/press-releases/2023/01/ftc-proposes-rule-ban-noncompete-clauses-which-hurt-workers-harm-competition>.

Federal Trade Commission 2023b, Generative AI Raises Competition Concerns, Federal Trade Commission, web blog post, 29 June, viewed 9 September 2023 <https://www.ftc.gov/policy/advocacy-research/tech-at-ftc/2023/06/generative-ai-raises-competition-concerns>.

Gans J, 2023, ‘They say AI is going to be the End of Us ... Again’, web blog post, viewed 9 September 2023 <https://joshuagans.substack.com/p/they-say-ai-is-going-to-be-the-end?utm_source=profile&utm_medium=reader2>.

Huet, E 2023, ‘What Happens When Sexting Chatbots Dump Their Human Lovers’, Bloomberg News, 22 March <https://www.bloomberg.com/news/articles/2023-03-22/replika-ai-causes-reddit-panic-after-chatbots-shift-from-sex>

Hutchins, J 1995, ‘The whisky was invisible, or Persistent myths of MT’, MT News International, 11 June, pp 17-18, viewed 9 September 2023, ACL Aclanthology database <https://aclanthology.org/www.mt-archive.info/90/MTNI-1995-Hutchins.pdf>.

Isaac, M 2023, ‘Reddit Wants to Get Paid for Helping to Teach Big A.I. Systems’, The New York Times, 18 April, viewed 9 September 2023, <https://www.nytimes.com/2023/04/18/technology/reddit-ai-openai-google.html>.

Khan, L 2023, ‘We Must Regulate A.I. Here’s How’ The New York Times, 3 May, viewed 9 September 2023 <https://www.nytimes.com/2023/05/03/opinion/ai-lina-khan-ftc-technology.html>.

Liang, C 2023, ‘My A.I. Lover’, New York Times, 23 May. <https://www.nytimes.com/2023/05/23/opinion/ai-chatbot-relationships.html>

Mandala 2023, ‘AI Human Capital Investment Index’, viewed 9 September 2023 <https://mandalapartners.com/posts/ai-human-capital-investment-index>.

McAfee 2023, Beware the Artificial Impostor: A McAfee Cybersecurity Artificial Intelligence Report, McAfee, San Jose, CA, <https://www.mcafee.com/content/dam/consumer/en-us/resources/cybersecurity/artificial-intelligence/rp-beware-the-artificial-impostor-report.pdf>.

McCarthy, J, Minsky M, Rochester N, Shannon C.E. 1955, A proposal for the Dartmouth Summer Research Project on Artificial Intelligence, viewed 9 September 2023, <http://www-formal.stanford.edu/jmc/history/dartmouth/dartmouth.html>.

Miller, C 2022, Chip War: The Fight for the World’s Most Critical Technology, Scribner, New York City, New York.

Oremus, W 2023, ‘AI chatbots lose money every time you use them. That is a problem’, The Washington Post, 5 June, viewed 9 September 2023 <https://www.washingtonpost.com/technology/2023/06/05/chatgpt-hidden-cost-gpu-compute/>.

Paris, F & Buchanan, L 2023, ‘35 Ways Real People Are Using A.I. Right Now’, The New York Times, 14 April, viewed 9 September 2023, <https://www.nytimes.com/interactive/2023/04/14/upshot/up-ai-uses.html>.

Roose, K 2022, ‘An A.I.-Generated Picture Won an Art Prize. Artists Aren’t Happy’, The New York Times, 2 September, viewed 9 September 2023, <https://www.nytimes.com/2022/09/02/technology/ai-artificial-intelligence-artists.html>.

Song, A 2019, ‘The Digital Entrepreneurial Ecosystem – a critique and reconfiguration’ Small Business Economics, vol. 53, iss. 5, pp. 569-590, viewed 9 September 2023, ResearchGate database <https://www.researchgate.net/publication/334709053_The_Digital_Entrepreneurial_Ecosystem-a_critique_and_reconfiguration>.

Wu, T 2018, The Curse of Bigness: Antitrust in the New Gilded Age, Columbia Global Reports, New York.

Yong-Jun, C 2023, ‘Music video for MAVE:’s debut song surpasses 10 million views’ Korea JoongAng Daily, 8 February, viewed 9 September 2023 <https://koreajoongangdaily.joins.com/2023/02/08/entertainment/kpop/Korea-Kpop-Metaverse/20230208165411002.html>.

* My thanks to Tori Barker and Toby Halligan for excellent drafting assistance. I am grateful to a range of colleagues, including John Asker, Andrew Charlton, Joshua Gans, Julian Hill and Jason McDonald for feedback on earlier drafts. No part of this speech was drafted by AI.

