The first video games, developed in the 1960s, were generally not commercialised: they ran on mainframe computers and were not available to the general public. Commercial game development began in the 1970s with the advent of first-generation video game consoles and early home computers like the Apple I. At that time, owing to the low cost and limited capabilities of computers, a lone programmer could develop a complete game. However, in the late 1980s and 1990s, ever-increasing computer processing power and heightened expectations from gamers made it difficult for a single person to produce a mainstream console or PC game. The average cost of producing a triple-A video game rose steadily, from US$1–4 million in 2000 to over $5 million in 2006, and then to over $20 million by 2010.
Mainstream commercial PC and console games are generally developed in phases. First, in pre-production, pitches, prototypes, and game design documents are written; if the idea is approved and the developer receives funding, full-scale development begins. Developing a complete game usually involves a team of 20–100 individuals with various responsibilities, including designers, artists, programmers, and testers.