YouTube

Logo: Red round-rectangular box with a white triangular “play” button inside and “YouTube” written in black; used since 2017.
Type of Business: Subsidiary
Type of Site: Online video platform
Founded: February 14, 2005 (19 years ago)
Headquarters: 901 Cherry Avenue, San Bruno, California, United States
Area Served: Worldwide (except in blocked countries)
Owner: Google LLC
Founders: Steve Chen, Chad Hurley, Jawed Karim
Key People: Neal Mohan (CEO), Chad Hurley (advisor)
Industry: Internet, video hosting service
Products: YouTube Kids, YouTube Music, YouTube Premium, YouTube Shorts, YouTube TV
Revenue: $31.5 billion (2023)
Parent Company: Google LLC (since 2006)
URL: youtube.com (localized domain names available)
Advertising: Google AdSense
Registration: Optional
Users: 2.7 billion monthly active users (January 2024)
Launched: February 14, 2005 (19 years ago)
Current Status: Active
Content License: Uploader holds copyright (standard license); Creative Commons can be selected
Written In: Python (core/API), C (through CPython), C++, Java (through Guice platform), Go, JavaScript (UI)

YouTube is an online video-sharing platform owned by Google. Accessible worldwide, it was founded on February 14, 2005, by Steve Chen, Chad Hurley and Jawed Karim, all former employees of PayPal. The company is based in San Bruno, California.

YouTube is the second most visited website in the world, right after Google Search. It has over 2.5 billion monthly users who watch more than a billion hours of videos every day. As of May 2019, users were uploading more than 500 hours of content to YouTube every minute. By 2021, the platform hosted approximately 14 billion videos.

In October 2006, Google bought YouTube for $1.65 billion (about $2.31 billion in 2023). After the purchase, Google expanded YouTube’s way of making money. Originally, YouTube made money only from ads but Google added options like paid movies and exclusive content. They also introduced YouTube Premium, a paid subscription service that lets users watch videos without ads.

YouTube started using Google’s AdSense program which helped both YouTube and content creators earn more money. By 2022, YouTube’s yearly revenue from ads had grown to $29.2 billion, an increase of over $9 billion compared to 2020.

Since Google bought YouTube, the platform has expanded beyond its core website into mobile apps, television apps and integrations with other platforms. YouTube hosts a wide range of video categories, including music videos, clips, news, short and feature films, songs, documentaries, movie trailers, teasers, TV spots, live streams, vlogs and more.

Most of the content on YouTube is created by individuals, often through collaborations between “YouTubers” and corporate sponsors. Many established media, news and entertainment companies have also started YouTube channels to reach a larger audience.

YouTube has had a huge impact on society, shaping popular culture, internet trends and creating multimillionaire celebrities. Despite its success, the platform faces criticism for various issues. People have accused YouTube of spreading misinformation, sharing copyrighted content without permission, violating user privacy, enabling censorship and putting child safety and wellbeing at risk. Additionally, YouTube has been criticized for inconsistent or incorrect enforcement of its guidelines.

History

Founding & Initial Growth (2005–2006)

We wanted to create a place where anyone with a video camera and a computer could create something that the whole world could watch. – Chad Hurley (Co-founder of YouTube)

YouTube was founded by Steve Chen, Chad Hurley and Jawed Karim. They were early employees at PayPal and made money when eBay bought the company. Hurley studied design at the Indiana University of Pennsylvania, while Chen and Karim studied computer science at the University of Illinois Urbana-Champaign.

A popular story in the media says Hurley and Chen came up with the idea for YouTube in early 2005 after struggling to share videos from a dinner party at Chen’s apartment in San Francisco. Although Karim didn’t attend the party and denied it happened, Chen acknowledged that the dinner party story was likely emphasized for marketing purposes because it was easy to understand and appealing.

The idea was to create a place where anyone could upload videos and share them with the world. – Jawed Karim (Co-founder of YouTube)

Jawed Karim said that the idea for YouTube was inspired by the Super Bowl XXXVIII halftime show controversy where Janet Jackson’s breast was briefly exposed by Justin Timberlake. Karim had trouble finding video clips of this incident and the 2004 Indian Ocean Tsunami online which sparked the idea for a video-sharing site.

Hurley and Chen initially thought of YouTube as a video-based online dating service, inspired by the website Hot or Not. They even posted ads on Craigslist, offering $100 to attractive women who uploaded videos of themselves to YouTube. However, they struggled to get enough dating videos, so they decided to allow uploads of any type of video instead.

YouTube started as a technology startup funded by venture capital. Between November 2005 and April 2006, the company raised money from several investors, with Sequoia Capital and Artis Capital Management being the largest contributors. YouTube’s first headquarters were located above a pizzeria and a Japanese restaurant in San Mateo, California.

In February 2005, the company launched the website www.youtube.com. The first video, titled “Me at the zoo,” was uploaded on April 23, 2005, featuring co-founder Jawed Karim at the San Diego Zoo and it is still available on the site.

In May 2005, YouTube launched a public beta version. By November, a Nike ad featuring Ronaldinho became the first video to reach one million views. The site officially launched on December 15, 2005, by which point it was getting 8 million views a day. At that time, video clips were limited to 100 megabytes, about 30 seconds of footage.

YouTube was not the first video-sharing site; Vimeo launched in November 2004, but it remained a side project for its developers from CollegeHumor. The week YouTube officially launched, NBCUniversal’s Saturday Night Live aired a skit called “Lazy Sunday” by The Lonely Island. This skit became an early viral video, boosting YouTube’s popularity and helping establish it as an important website.

Unofficial uploads of “Lazy Sunday” on YouTube gained over five million views by February 2006 before NBCUniversal requested their removal due to copyright concerns. Despite being taken down, these uploads helped increase YouTube’s popularity and led to more third-party content being shared.

YouTube’s growth was rapid; by July 2006, more than 65,000 new videos were being uploaded daily and the site was receiving 100 million video views per day.

The name www.youtube.com caused problems for a website called www.utube.com, owned by Universal Tube & Rollform Equipment. In November 2006, Universal Tube filed a lawsuit against YouTube because their site was frequently overloaded by people trying to reach YouTube. As a result, Universal Tube changed its website to www.utubeonline.com.

The original YouTube logo was used from the site’s launch until 2007. It returned briefly in 2008 before being removed again in 2010. A version of this logo without the “Broadcast Yourself” slogan was used until 2011.

“Broadcast Yourself” Era (2006–2013)

On October 9, 2006, Google announced its purchase of YouTube for $1.65 billion in Google stock, finalizing the deal on November 13, 2006. This acquisition sparked increased interest in video-sharing sites. To stand out from YouTube, Vimeo’s owner, IAC, focused on supporting content creators. Around this time, YouTube adopted the slogan “Broadcast Yourself.”

YouTube experienced rapid growth. In 2007, The Daily Telegraph reported that YouTube used as much bandwidth as the entire Internet did in 2000. By 2010, YouTube had about 43% market share and over 14 billion video views according to comScore. That year, the site simplified its interface to encourage users to spend more time on it.

By 2011, users were watching over three billion videos daily, with 48 hours of new videos uploaded every minute. However, most views came from a small number of videos; 30% of videos accounted for 99% of views. In 2011, YouTube updated its interface and introduced a new logo with a darker red shade.

In 2013, YouTube rolled out another interface change to unify the user experience across desktop, TV, and mobile. By then, over 100 hours of video were uploaded every minute increasing to 300 hours by November 2014.

During this period, YouTube underwent several changes. In October 2006, the company moved to a new office in San Bruno, California. In October 2010, Chad Hurley stepped down as CEO to take an advisory role and Salar Kamangar became the new head of the company.

In December 2009, YouTube partnered with Vevo. In April 2010, Lady Gaga’s “Bad Romance” became the most viewed video, reaching 200 million views by May 9, 2010.

In 2007, Viacom International filed a major copyright-infringement lawsuit against YouTube, seeking $1 billion in damages. In 2010, a federal district court ruled that YouTube was protected by the DMCA’s safe-harbor provisions; after an appeal and remand, the parties settled in 2014.

Susan Wojcicki & Mainstream Expansion (2014–2018)

In February 2014, Susan Wojcicki became the CEO of YouTube. In January 2016, YouTube expanded its San Bruno headquarters by buying an office park for $215 million, providing 554,000 square feet of space for up to 2,800 employees. In August 2017, YouTube launched a redesign of its user interface based on Material Design and introduced a new logo centered around the play button emblem.

During this period, YouTube explored various ways to generate revenue beyond ads. In 2013, YouTube started a test program allowing content creators to offer paid subscription channels. This program was discontinued in January 2018 and relaunched in June 2018 with $4.99 channel subscriptions. These subscriptions complement the Super Chat feature, introduced in 2017, which lets viewers pay between $1 and $500 to have their comments highlighted.

In 2014, YouTube announced “Music Key,” a subscription service offering ad-free streaming of music content on YouTube, bundled with Google Play Music. In 2015, YouTube introduced YouTube Red, a premium service offering ad-free access to all content, original series and films by YouTube personalities and background playback on mobile devices. YouTube also launched YouTube Music, an app focused on streaming and discovering music content on the platform.

YouTube created products aimed at specific audiences. In 2015, it launched YouTube Kids, a mobile app with a simplified user interface, curated age-appropriate content and parental controls. That same year, YouTube introduced YouTube Gaming, a platform for video game videos and live streaming designed to compete with Twitch.

On April 3, 2018, a shooting occurred at YouTube’s headquarters in San Bruno, California, wounding four people and resulting in the death of the shooter.

Recent History (2019–Present)

By February 2017, people were watching one billion hours of YouTube videos every day and 400 hours of video were uploaded every minute. By 2019, this increased to over 500 hours per minute.

During the COVID-19 pandemic, YouTube usage soared as people stayed home. One data firm estimated that YouTube accounted for 15% of all internet traffic, double its pre-pandemic level. In response to EU officials’ requests to reduce bandwidth usage to ensure medical entities had enough, YouTube and Netflix lowered streaming quality for at least 30 days to cut their bandwidth use by 25%. YouTube later extended this measure worldwide, working with governments and network operators to minimize system stress during the pandemic.

In 2018, YouTube faced a complaint alleging violations of the Children’s Online Privacy Protection Act (COPPA). In 2019, the FTC fined YouTube $170 million for collecting personal information from children under 13 and required it to build systems to improve children’s privacy. After criticism of these systems, YouTube began treating all videos marked as “made for kids” under COPPA rules starting January 6, 2020. To further enhance child safety, YouTube introduced a supervised mode for tweens in 2021; around the same period it also launched YouTube Shorts to compete with TikTok.

YouTube also had disputes with other tech companies. In 2018 and 2019, there was no YouTube app for Amazon Fire products due to disagreements. In 2020, Roku removed the YouTube TV app from its store after failing to reach an agreement with YouTube.

In November 2021, YouTube stopped showing dislike counts on videos. They claimed their research showed the dislike feature was often used for cyberbullying and brigading. Some users appreciated the move for discouraging trolls but others felt it would make it harder to identify clickbait or unhelpful videos. Critics argued that there were already features to limit bullying.

YouTube co-founder Jawed Karim called the change “a stupid idea,” suggesting that the real reason for it was not publicly disclosed. He argued that users’ ability to identify harmful content through dislikes was essential, calling it “the wisdom of the crowds.”

In response, software developer Dmitry Selivanov created “Return YouTube Dislike,” a browser extension for Chrome and Firefox that lets users see dislike counts again.

In a letter published on January 25, 2022, then-YouTube CEO Susan Wojcicki acknowledged the controversy but stood by the decision, claiming it reduced dislike attacks.

In 2022, YouTube tested a new ad format in which users watching longer videos on TVs were shown a long chain of un-skippable ads at the beginning. The experiment aimed to consolidate all ads at the start of the video. However, due to public backlash, YouTube ended the experiment on September 19, 2022.

In October 2022, YouTube introduced customizable user handles (e.g., @MrBeast6000) that would also serve as channel URLs.

On February 16, 2023, Susan Wojcicki announced she would step down as CEO, with Neal Mohan named as her successor. Wojcicki will continue in an advisory role for Google and its parent company, Alphabet.

In late October 2023, YouTube started cracking down on ad blockers. Users with ad blockers may see a pop-up warning that their video player will be blocked after three videos. They are prompted to either allow ads or subscribe to YouTube Premium for an ad-free experience. YouTube stated that using ad blockers violates its terms of service.

In April 2024, YouTube announced it would enforce stricter measures on third-party apps that block ads such as NewPipe, to ensure compliance with YouTube’s Terms of Service.

Senior Leadership

YouTube has had a CEO since its founding in 2005. Chad Hurley was the first CEO, leading until 2010. After Google acquired YouTube, Salar Kamangar took over as CEO and served until 2014. Susan Wojcicki then became CEO until she resigned in 2023. The current CEO, Neal Mohan, was appointed on February 16, 2023.

Features

Video Technology

YouTube primarily uses the VP9 and H.264/MPEG-4 AVC video codecs and the Dynamic Adaptive Streaming over HTTP (DASH) protocol. For low bandwidth connections, it also provides MPEG-4 Part 2 streams within 3GP containers. By January 2019, YouTube started rolling out videos in the AV1 format. In 2021, there were reports that YouTube was considering requiring AV1 in streaming hardware to decrease bandwidth and improve quality. Videos are usually streamed with Opus and AAC audio codecs.

When YouTube launched in 2005, viewing videos on a computer required the Adobe Flash Player plug-in. In January 2010, YouTube introduced an experimental version of the site using HTML5 video, allowing users to watch videos without Flash Player or any other plug-in. On January 27, 2015, YouTube announced that HTML5 video would be the default playback method on compatible browsers. HTML5 video streams use Dynamic Adaptive Streaming over HTTP (DASH), which adapts the bitrate and quality to the available network.

YouTube allows videos to be played at lower resolutions starting at 144p, to ensure smoother playback in areas with limited internet speeds and to help users save on cellular data. The resolution can adjust automatically based on the connection speed or be set manually.
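The adaptive behavior described above can be sketched as a ladder lookup: pick the highest resolution whose bitrate fits the measured connection speed. The bitrate figures and the `pick_resolution` helper below are illustrative assumptions, not YouTube’s actual encoding ladder.

```python
# Illustrative (resolution, approx. Mbit/s required) rungs; these numbers
# are assumptions for the sketch, not YouTube's real encoding ladder.
LADDER = [
    (144, 0.1),
    (360, 0.7),
    (720, 2.5),
    (1080, 5.0),
    (2160, 20.0),
]

def pick_resolution(measured_mbps: float, safety: float = 0.8) -> int:
    """Return the highest resolution whose bitrate fits the connection,
    keeping a safety margin; fall back to the lowest rung (144p)."""
    budget = measured_mbps * safety
    best = LADDER[0][0]
    for res, mbps in LADDER:
        if mbps <= budget:
            best = res
    return best

print(pick_resolution(4.0))   # 720
print(pick_resolution(0.05))  # 144
```

A real DASH client re-measures throughput continuously and switches rungs mid-stream; this sketch only shows the selection step.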

From 2008 to 2017, users could add “annotations” to their videos such as pop-up text messages and hyperlinks, making videos interactive. However, by 2019, all annotations were removed, affecting videos that relied on this feature. YouTube replaced annotations with standardized widgets like “end screens” which display customizable thumbnails for specified videos near the end of a video.

In 2018, YouTube became an official registry for the International Standard Name Identifier (ISNI). The platform announced plans to create ISNI identifiers to uniquely identify the musicians featured in its videos.

Users can verify their YouTube account, usually through a mobile phone, to upload videos up to 12 hours long and create live streams. Users with a good track record and sufficient channel history also gain these features.

When YouTube launched in 2005, there was no limit on video length. A 10-minute limit was introduced in March 2006 because many longer videos were unauthorized uploads of TV shows and films; it was raised to 15 minutes in July 2010. Verified uploads can be up to 256 GB in size or 12 hours long, whichever limit is reached first.

As of 2021, YouTube offers automatic closed captions in 13 languages using speech recognition technology which can be machine-translated during playback.

YouTube allows manual closed captioning through its creator studio. It used to have a ‘Community Captions’ feature where viewers could submit captions for approval but this was discontinued in September 2020.

YouTube accepts various common video formats including MP4, Matroska, FLV, AVI, WebM, 3GP, MPEG-PS and QuickTime File Format. It also accepts some intermediate formats used in professional video editing like ProRes. YouTube provides recommended encoding settings for uploads.

Each video on YouTube is identified by an eleven-character alphanumeric string in its URL which can include letters, digits, underscores (_) and dashes (-).
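The alphabet described above (letters, digits, underscore, dash) has 64 characters, so an eleven-character ID has 64^11 possible values. A minimal sketch of validating such an ID:

```python
import re

# 64-character alphabet implied by the text: A-Z, a-z, 0-9, "_", "-".
VIDEO_ID = re.compile(r"[A-Za-z0-9_-]{11}")

def looks_like_video_id(s: str) -> bool:
    """Check that a string matches the eleven-character ID shape.
    This validates the format only; it cannot tell whether the ID exists."""
    return VIDEO_ID.fullmatch(s) is not None

print(looks_like_video_id("dQw4w9WgXcQ"))  # True
print(looks_like_video_id("too-short"))    # False
print(64 ** 11)  # 73786976294838206464, i.e. about 73.8 quintillion
```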

In 2018, YouTube introduced a feature called Premiere which notifies users when a video will be available for the first time, similar to a live stream but with a prerecorded video. At the scheduled time, the video is aired as a live broadcast with a two-minute countdown. Premieres can also be started immediately.

Quality & Formats

YouTube originally offered videos at a resolution of 320×240 pixels using the Sorenson Spark codec and mono MP3 audio. In June 2007, YouTube added an option to watch videos in 3GP format on mobile phones. In March 2008, a high-quality mode increased the resolution to 480×360 pixels.

In December 2008, YouTube introduced 720p HD support and changed the player to a widescreen 16:9 aspect ratio, switching to H.264/MPEG-4 AVC as the default video format. In November 2009, 1080p HD support was added. In July 2010, YouTube began supporting 4K video at resolutions up to 4096×3072 pixels; support for 2160p UHD at 3840×2160 pixels followed later.

In June 2014, YouTube began supporting high frame rate videos up to 60 frames per second, enhancing motion-intensive videos like video game footage. By June 2015, 8K resolution support was added, allowing videos to play at 7680×4320 pixels. In November 2016, HDR video support was introduced, using either hybrid log-gamma (HLG) or perceptual quantizer (PQ) and the Rec. 2020 color space.

YouTube offers videos in various quality levels. Viewers can indirectly influence the video quality. On mobile apps, users can choose between:

  • Auto: Adjusts resolution based on internet connection.
  • High Picture Quality: Prioritizes high-quality video.
  • Data Saver: Sacrifices video quality to save data.
  • Advanced: Allows users to select a specific resolution.

On desktop, users can choose between “Auto” and specific resolutions. However, viewers cannot directly choose a higher bitrate for any selected resolution.

Since 2009, YouTube has supported 3D videos. In 2015, it began supporting 360-degree videos. By April 2016, YouTube allowed live streaming of 360° videos and regular videos up to 1440p and by November 2016, both were supported up to 4K (2160p) resolution. Due to the limited number of users watching beyond 90 degrees, YouTube introduced VR180, an easier-to-produce stereoscopic format that allows users to watch any video using virtual reality headsets.

During the COVID-19 pandemic, YouTube temporarily reduced video quality due to increased viewership. In 2021, YouTube developed its own chip, called “Argos” to help encode higher resolution videos.

In April 2023, YouTube introduced an enhanced bitrate “1080p Premium” option for YouTube Premium subscribers on iOS which became available on desktop platforms in August 2023.

YouTube also allows some older videos uploaded in poor quality to be upgraded. For example, YouTube partnered with Universal Music Group to remaster 1,000 music videos.

Live Streaming

YouTube started experimenting with live streaming early on including events like YouTube Live in 2008, a U2 concert in 2009, and a Q&A session with US President Barack Obama in February 2010. Initially, these tests used third-party technology but by September 2010, YouTube began testing its own live streaming system.

In April 2011, YouTube launched YouTube Live, initially limited to select partners. It was used to broadcast events like the 2012 Olympics in London. In October 2012, over 8 million people watched Felix Baumgartner’s space jump live on YouTube.

In May 2013, YouTube allowed verified users with at least 1,000 subscribers to create live streams. This limit was reduced to 100 subscribers in August 2013, and by December 2013, there were no subscriber limits. In February 2017, live streaming was added to the YouTube mobile app, initially for users with at least 10,000 subscribers but this was later reduced to 100 subscribers. Live streams support HDR, can reach up to 4K resolution at 60 fps and also support 360° video.

User Features

Comment System

Most YouTube videos allow users to leave comments. These comments have often been criticized for their negative content. In 2006, Time praised Web 2.0 for enabling large-scale community and collaboration but noted that some YouTube comments “harness the stupidity of crowds” with poor spelling, obscenities and hatred.

The Guardian in 2009 described YouTube comments as often juvenile, aggressive, misspelled, sexist and homophobic. Comments range from raging at the video content to providing overly detailed descriptions followed by “LOL,” with occasional bursts of wit.

In September 2008, The Daily Telegraph noted that YouTube was “notorious” for having some of the most confrontational and poorly written comments on the internet. They reported on a new software called YouTube Comment Snob which blocks rude and illiterate posts.

In April 2012, The Huffington Post observed that finding comments on YouTube that are “offensive, stupid and crass” is easy and common.

On November 6, 2013, Google changed YouTube’s comment system to require all users to use a Google+ account for commenting. The aim was to give creators more power to moderate and block comments, improving their quality and tone. The new system also restored the ability to include URLs in comments which had been removed due to abuse.

YouTube co-founder Jawed Karim criticized the change by asking, “why the fuck do I need a google+ account to comment on a video?” on his channel. The official announcement received over 20,000 “thumbs down” votes and more than 32,000 comments in two days.

Chase Melvin, writing for Newsday’s Silicon Island blog, noted that Google+ was not as popular as Facebook and was being forced on YouTube users who didn’t want to lose their ability to comment. He highlighted the widespread backlash against the new comment system across discussion forums.

Melvin conceded that while complaints about the new comment system might be justified, revamping the old system had its merits: it had allowed crude, misogynistic and racially charged comments with little moderation, and any attempt to curb harmful comments was worth trying. Although the new system was not perfect, he argued, Google deserved credit for trying to reduce the damage caused by angry YouTubers hiding behind anonymity.

On July 27, 2015, Google announced it would no longer require a Google+ account to post comments on YouTube. On November 3, 2016, YouTube introduced a trial where video creators could approve, hide or report comments based on an algorithm detecting offensive content. Creators could also manage comments with links or hashtags to combat spam and allow other users to help moderate comments.

In December 2020, YouTube introduced a feature that warns users when they post comments that “may be offensive to others.”

Community

On September 13, 2016, YouTube launched a public beta of Community, a feature allowing users to post text, images (including GIFs), live videos and more on a separate “Community” tab on their channel. Before the release, YouTube consulted with several creators including Vlogbrothers, AsapScience, Lilly Singh, The Game Theorists and others, to gather suggestions for useful tools.

Once officially released, the Community post feature was automatically activated for channels with a certain number of subscribers. This threshold has been lowered over time from 10,000 subscribers to 1,500, then to 1,000, and finally to 500 subscribers.

When the Community tab is enabled for a channel, the channel’s previous discussions (formerly known as channel comments) are permanently deleted instead of being migrated or coexisting with the new feature.

TestTube

YouTube used to have an area called TestTube for accessing experimental features. For example, in October 2009, they introduced a comment search feature under /comment_search, but it was later removed. In the same year, YouTube Feather was launched as a “lightweight” version of the site for countries with slow internet speeds.

After transitioning to the Polymer layout, TestTube was disabled and its URL now redirects to video playback settings. TestTube was replaced by a new system where users need to be premium members to enable or disable experimental features.

Content Accessibility

YouTube allows users to view its videos on external web pages. Each video has a piece of HTML code that can be used to embed it on any website. This feature is commonly used to embed videos on social networking sites and blogs.
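As a sketch of what such an embed snippet looks like, the helper below builds iframe markup of the kind commonly used for YouTube embeds. The `https://www.youtube.com/embed/<id>` URL pattern matches public embeds; the default dimensions and the `embed_html` helper name are illustrative assumptions.

```python
def embed_html(video_id: str, width: int = 560, height: int = 315) -> str:
    """Build an iframe snippet for embedding a video on another page.
    Dimensions are illustrative defaults, not a YouTube requirement."""
    return (
        f'<iframe width="{width}" height="{height}" '
        f'src="https://www.youtube.com/embed/{video_id}" '
        f'frameborder="0" allowfullscreen></iframe>'
    )

print(embed_html("dQw4w9WgXcQ"))
```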

Users could also post a “video response” to discuss, be inspired by, or relate to another user’s video. Each YouTube video has an eleven-character identifier, allowing for around 73.8 quintillion unique IDs.

On August 27, 2013, YouTube removed the video response feature due to low usage. Video owners can disable embedding, rating, commenting and response posting.

YouTube typically does not provide download links for videos, intending them to be viewed on its site. A few videos can be downloaded as MP4 files. Many third-party websites, apps and browser plug-ins allow users to download YouTube videos.

In February 2009, YouTube tested a service allowing some partners to offer video downloads for free or for a fee through Google Checkout. In June 2012, Google sent cease and desist letters to several websites offering online download and conversion of YouTube videos, leading some, like Zamzar, to remove this feature.

Users retain copyright of their work under the default Standard YouTube License but can grant usage rights under any public copyright license they choose. Since July 2012, users can select a Creative Commons attribution license, allowing others to reuse and remix their material.

Platforms

Most modern smartphones can access YouTube through an app or an optimized website. YouTube Mobile launched in June 2007 using RTSP streaming, though not all videos were available on the mobile version.

Since June 2007, YouTube videos have been available on various Apple products, requiring the content to be converted to Apple’s preferred H.264 video format. This allowed viewing on devices like Apple TV, iPod Touch and iPhone.

In July 2010, the mobile version of the site was relaunched using HTML5 video, optimized for touch screens and removing the need for Adobe Flash Player. The mobile version is also available as an app for Android.

In September 2012, YouTube launched its first standalone app for the iPhone after Apple dropped it as a preloaded app in iOS 6 and the iPhone 5. Between April and June 2013, YouTube was the third-most-used smartphone app, used by 35% of smartphone users.

In July 2008, a TiVo service update allowed the system to search and play YouTube videos.

In January 2009, YouTube launched “YouTube for TV,” a version of the website designed for set-top boxes and TV-based media devices with web browsers. It initially supported the PlayStation 3 and Wii game consoles.

In June 2009, YouTube introduced YouTube XL, a simplified interface for viewing on standard television screens. YouTube is also available as an app on Xbox Live.

On November 15, 2012, Google launched an official app for the Wii, allowing users to watch YouTube videos from the Wii channel. An app was available for Wii U and Nintendo 3DS but was discontinued in August 2019. Videos can still be viewed on the Wii U Internet Browser using HTML5 video.

YouTube became available on the Roku player on December 17, 2013 and on the Sony PlayStation 4 in October 2014.

In November 2018, YouTube launched as a downloadable app for the Nintendo Switch.

International & Localization

In its early years, Google faced criticism for promoting US values by prioritizing English over other languages. On June 19, 2007, Google CEO Eric Schmidt announced YouTube localization at a conference in Paris. The aim was to customize the YouTube experience by country, including country-specific comments, metrics and video rankings. YouTube’s localization began rolling out that year.

A 2015 report showed YouTube’s localization efforts were continuing and expanding. In February 2023, YouTube allowed uploading a single video in multiple languages. Before this, creators had to launch separate channels for each language and upload dubbed versions of their videos to those channels. The new multi-language dub tracks feature was praised by creators like MrBeast as a “giant win” leading many to switch from separate channels to uploading dubbed versions on their main channel.

YouTube Localization by Country

As of 2024, YouTube has localized versions available in 104 countries and one territory (Hong Kong), along with a worldwide version. YouTube continues to expand its localized versions to more countries and regions.

If YouTube cannot identify a user’s specific country or region based on their IP address, it defaults to the United States. However, YouTube provides language and content preferences for all accessible countries, regions and languages.

YouTube suggests a local version based on the user’s IP address. Sometimes, users may see the message “This video is not available in your country” due to copyright restrictions or inappropriate content.

The YouTube interface is available in 76 languages including some without local channel versions such as Amharic, Albanian, Armenian, Burmese, Haitian Creole, Kyrgyz, Malagasy, Mongolian, Persian, Samoan, Somali and Uzbek.

Access to YouTube was blocked in Turkey from 2008 to 2010 due to videos deemed insulting to Mustafa Kemal Atatürk and offensive to Muslims. In October 2012, a local version of YouTube (youtube.com.tr) was launched in Turkey, subject to Turkish content regulations.

In March 2009, a dispute between YouTube and the British royalty collection agency PRS for Music led to the blocking of premium music videos in the UK. The dispute was resolved in September 2009. In April 2009, a similar dispute led to the removal of premium music videos in Germany.

Videos

In January 2012, it was estimated that visitors spent an average of 15 minutes a day on YouTube, compared to the four or five hours a day a typical US citizen spent watching TV. By 2017, viewers were watching YouTube on mobile devices for over an hour daily.

In December 2012, two billion views were removed from Universal and Sony music videos on YouTube. The Daily Dot claimed this was due to using automated processes to inflate view counts, violating YouTube’s terms of service. However, Billboard reported that the views were moved to Vevo because the videos were no longer active on YouTube.

On August 5, 2015, YouTube removed the mechanism that froze a video’s view count at “301” (later “301+”) until the views had been verified as legitimate. View counts now update in real time.

Since September 2019, YouTube has abbreviated public subscriber counts to three significant digits (for example, “1.23M” rather than an exact figure). This affects third-party real-time counters such as Social Blade. Exact subscriber counts remain available to channel operators in YouTube Studio.
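The three-significant-digit abbreviation can be sketched with a small helper. This is illustrative only, not YouTube’s actual code; the function name and the choice of rounding (rather than truncation) are assumptions.

```python
def abbreviate(count: int) -> str:
    """Abbreviate a count to three significant digits with a K/M/B suffix.

    Sketch of YouTube-style public counters (illustrative; whether the real
    system rounds or truncates is not documented here).
    """
    for threshold, suffix in ((1_000_000_000, "B"), (1_000_000, "M"), (1_000, "K")):
        if count >= threshold:
            value = count / threshold
            # Keep three significant digits: 1.23M, 12.3M, 123M.
            if value < 10:
                text = f"{value:.2f}"
            elif value < 100:
                text = f"{value:.1f}"
            else:
                text = f"{value:.0f}"
            # Drop trailing zeros so 12.0 renders as "12".
            return text.rstrip("0").rstrip(".") + suffix
    return str(count)  # counts below 1,000 are shown exactly

print(abbreviate(1_234_567))   # "1.23M"
print(abbreviate(12_000_000))  # "12M"
print(abbreviate(987))         # "987"
```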

On November 11, 2021, YouTube announced it would start hiding dislike counts on videos after testing this change in March. The decision was made to protect smaller creators from dislike brigading and harassment. Creators can still see the number of likes and dislikes in the YouTube Studio dashboard.

Copyright Issues

YouTube has faced many challenges and criticisms regarding copyright issues. The site’s first viral video, “Lazy Sunday,” had to be taken down due to copyright concerns. When uploading a video, users are reminded not to violate copyright laws but many unauthorized clips still appear on YouTube.

YouTube does not review videos before they are posted. Copyright holders must issue a DMCA takedown notice under the Online Copyright Infringement Liability Limitation Act. A successful complaint results in a YouTube copyright strike. If a user receives three copyright strikes, their account and all uploaded videos are deleted.

From 2007 to 2009, organizations like Viacom, Mediaset and the English Premier League sued YouTube, claiming it did too little to prevent copyrighted material from being uploaded.

In August 2008, a US court ruled in Lenz v. Universal Music Corp. that copyright holders must consider whether a post reflects fair use before requesting its removal. In November 2015, YouTube’s owner Google announced they would help cover legal costs in select cases where fair use defenses apply.

In 2011, professional singer Matt Smith sued Summit Entertainment for wrongful copyright takedown notices on YouTube in the case of Smith v. Summit Entertainment LLC. He had seven causes of action and four were ruled in his favor.

In April 2012, a Hamburg court ruled that YouTube could be held responsible for copyrighted material posted by users. On November 1, 2016, YouTube resolved a dispute with GEMA, allowing ads on videos with content protected by GEMA using Google’s Content ID system.

In April 2013, it was reported that Universal Music Group (UMG) and YouTube had an agreement preventing content blocked by UMG from being restored, even if the uploader filed a DMCA counter-notice.

As part of the launch of YouTube Music, Universal and YouTube signed an agreement in 2017, followed by similar agreements with other major labels. These agreements allowed the labels to earn advertising revenue when their music was played on YouTube.

By 2019, creators were experiencing videos being taken down or demonetized when Content ID identified even short segments of copyrighted music in a much longer video. Enforcement varied depending on the record label and experts noted that some of these clips qualified for fair use.

Content ID

In June 2007, YouTube began testing an automatic system to detect uploaded videos that infringe on copyright. Google CEO Eric Schmidt saw this system as essential for resolving lawsuits like the one from Viacom which claimed YouTube profited from unauthorized content.

Initially called “Video Identification” and later known as Content ID, the system creates ID files for copyrighted audio and video material and stores them in a database. When a video is uploaded, it is checked against this database. If a match is found, the video is flagged as a copyright violation. The content owner can then choose to block the video, track its viewing statistics or add advertisements to it.
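The flow described above can be sketched as a fingerprint lookup: reference material is reduced to compact fingerprints stored in a database, each upload is fingerprinted the same way, and sufficient overlap triggers a claim. The sketch below uses exact hashes of short byte windows for simplicity; Content ID’s real fingerprints are perceptual, so they also match re-encoded or distorted copies. All names, data and thresholds here are illustrative.

```python
import hashlib

def fingerprint(samples: bytes, window: int = 8) -> set:
    """Fingerprint a byte stream as hashes of overlapping windows.

    Illustrative only: real systems use robust perceptual features,
    not exact hashes of raw bytes.
    """
    return {
        hashlib.sha256(samples[i:i + window]).hexdigest()
        for i in range(max(len(samples) - window + 1, 1))
    }

# Reference database: fingerprint -> rights holder (hypothetical entries).
reference_db = {}

def register(owner: str, samples: bytes) -> None:
    """Store a rights holder's reference fingerprints."""
    for fp in fingerprint(samples):
        reference_db[fp] = owner

def check_upload(samples: bytes, threshold: float = 0.5):
    """Return the matched owner if enough windows match, else None.

    On a match, the owner could then block, track, or monetize the upload.
    """
    fps = fingerprint(samples)
    hits = [reference_db[fp] for fp in fps if fp in reference_db]
    if len(hits) / len(fps) >= threshold:
        return max(set(hits), key=hits.count)
    return None

register("UMG", b"never gonna give you up, never gonna let you down")
print(check_upload(b"never gonna give you up"))          # "UMG"
print(check_upload(b"completely original audio here"))   # None
```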

In 2009, an independent test found that while YouTube’s Content ID system was effective at finding copyright violations in audio tracks, it wasn’t perfect. The automatic removal of content by Content ID has caused controversy since videos are not checked by a human for fair use. Users can dispute a Content ID decision by filling out a form.

Before 2016, videos weren’t monetized until disputes were resolved. Since April 2016, videos remain monetized during disputes, with the money going to whichever party prevails. Uploaders can remove disputed audio to monetize the video again using the “Video Manager.” The effectiveness of Content ID is one reason YouTube has allowed some users to upload videos of unlimited length since December 2010.

Moderation & Offensive Content

YouTube has community guidelines to prevent the abuse of its features. It forbids uploading videos with defamation, pornography and content encouraging criminal activity. Generally prohibited material includes sexually explicit content, animal abuse, shock videos, copyrighted content without permission, hate speech, spam and predatory behavior.

Users can flag inappropriate videos and a YouTube employee will review them to determine if they violate the guidelines. Despite these rules, YouTube has faced criticism for:

  • Its recommendation algorithms promoting conspiracy theories and falsehoods
  • Hosting videos aimed at children but containing violent or sexually suggestive content with popular characters
  • Videos of minors attracting pedophilic comments
  • Inconsistent policies on what content can be monetized with ads

YouTube hires companies to provide content moderators who review flagged content to decide if it should be removed. In September 2020, a former content moderator filed a class-action lawsuit, claiming she developed PTSD after 18 months on the job. She stated she was often required to view graphic content for more than the recommended four hours per day.

The lawsuit alleges that YouTube’s contractors provided little to no training or mental health support for moderators. Prospective employees had to sign NDAs before being shown examples of the content they would review. The lawsuit also claims that YouTube censored all mentions of trauma in its internal forums and rejected recommendations, advised by the National Center for Missing and Exploited Children, to blur, shrink or render extremely graphic content in monochrome, deeming them not a high priority.

YouTube has implemented a comprehensive policy to limit the spread of misinformation and fake news, particularly addressing technically manipulated videos.

Controversial content on YouTube has included Holocaust denial and videos about the Hillsborough disaster, in which 96 football fans were crushed to death in 1989. In July 2008, the UK House of Commons Culture, Media and Sport Committee criticized YouTube’s content policing system, calling for proactive content review.

YouTube responded by highlighting its strict rules, a 24/7 review team and an easy reporting system for users.

In October 2010, U.S. Congressman Anthony Weiner urged YouTube to remove videos of imam Anwar al-Awlaki. YouTube removed some of these videos in November 2010, citing guideline violations. In December 2010, YouTube added the option to flag videos for containing terrorism content.

In 2018, YouTube introduced a system that automatically adds information boxes to videos its algorithms identify as potentially spreading conspiracy theories or fake news. These boxes contain content from reliable sources to inform users and minimize misinformation without impacting freedom of speech.

After the Notre-Dame fire on April 15, 2019, some videos of the fire were incorrectly flagged with an Encyclopædia Britannica article about 9/11 conspiracy theories. Users complained and YouTube officials apologized, explaining that their algorithms had misidentified the fire videos. They took steps to fix the issue.

On April 18, 2023, YouTube updated its Community Guidelines to address content related to eating disorders. The new rules prohibit content that might encourage harmful behavior such as severe calorie tracking and purging. However, videos promoting positive behavior, like recovery stories, are allowed if the user is logged in and over 18.

This policy was developed in collaboration with nonprofit organizations and the National Eating Disorder Association. Garth Graham, YouTube’s Global Head of Healthcare, told CNN that the goal is to provide a safe space for “community recovery and resources” while protecting viewers.

Homophobia & Transphobia

In August 2019, five leading LGBTQ+ content creators filed a federal lawsuit against YouTube. They claimed YouTube’s algorithms limit the visibility of their channels, impacting their revenue. The plaintiffs argued that the algorithms discourage content with words like “lesbian” or “gay” which are common in their videos. They also accused YouTube of abusing its dominant position in online video services.

In June 2022, Media Matters, a media watchdog group, reported an increase in homophobic and transphobic content on YouTube. This content included calling LGBT people “predators” and “groomers” and accusing them of being mentally ill. The report stated that such content appeared to violate YouTube’s hate speech policy.

Animal Torture

Starting in 2020, videos featuring animal cruelty on YouTube gained media attention. In late 2020, the animal welfare charity Lady Freethinker identified 2,053 videos on YouTube where animals were “deliberately harmed for entertainment or shown in severe distress or pain.”

In 2021, Lady Freethinker filed a lawsuit against YouTube, accusing it of breaching its contract by allowing these videos and failing to remove them when notified. YouTube responded by expanding its policy on animal abuse videos in 2021, removing hundreds of thousands of videos and terminating thousands of channels for violations.

In 2022, Google won the lawsuit, with a judge ruling that YouTube was protected by Section 230 of the Communications Decency Act which shields internet platforms from lawsuits based on user-posted content.

In 2023, YouTube reiterated that animal abuse has no place on their platform and that they are working to remove such content.

YouTube as a Tool to Promote Conspiracy Theories & Far-Right Content

YouTube has faced criticism for its algorithm which often promotes videos with conspiracy theories, falsehoods and incendiary fringe discourse. An investigation by The Wall Street Journal found that YouTube’s recommendations frequently lead users to channels with conspiracy theories, partisan viewpoints and misleading videos, even if they haven’t shown interest in such content. When users show a political bias in their viewing, YouTube tends to recommend videos that reinforce those biases, often with more extreme viewpoints.

When users search for political or scientific terms, YouTube’s search algorithms often prioritize hoaxes and conspiracy theories. After controversy arose from YouTube promoting falsehoods and conspiracy theories during the 2017 Las Vegas shooting, YouTube adjusted its algorithm to favor mainstream media sources. However, in 2018, it was reported that YouTube continued to promote fringe content about breaking news, such as conspiracy videos about Anthony Bourdain’s death.

In 2017, it was revealed that advertisements were being placed on extremist videos including those by rape apologists, anti-Semites and hate preachers who received ad payouts. After firms began to pull their ads from YouTube, the platform apologized and promised to give advertisers greater control over ad placements.

Far-right conspiracy theorist Alex Jones built a large audience on YouTube. In 2018, YouTube faced criticism for removing a video from Media Matters that compiled offensive statements made by Jones, citing violations of its harassment and bullying policies. However, on August 6, 2018, YouTube removed Alex Jones’ entire channel for content violations.

University of North Carolina professor Zeynep Tufekci called YouTube “The Great Radicalizer,” suggesting it is one of the most powerful radicalizing tools of the 21st century. Jonathan Albright from the Tow Center for Digital Journalism at Columbia University described YouTube as a “conspiracy ecosystem.”

In January 2019, YouTube introduced a policy in the United States to stop recommending videos that could misinform users in harmful ways. Examples include flat earth theories, miracle cures and 9/11 truther content. Initially, efforts to stop recommending borderline extremist videos that didn’t quite violate hate speech policies were rejected due to concerns about affecting viewer engagement.

In January 2019, YouTube announced measures to promote authoritative content and reduce borderline content and harmful misinformation. In June, YouTube said it would ban Holocaust denial and neo-Nazi content. It also blocked the neo-Nazi propaganda film “Europa: The Last Battle” from being uploaded.

Research studies have highlighted instances of misinformation on YouTube:

  • July 2019 Study: A study using the Tor Browser found that most videos related to climate change presented views contrary to the scientific consensus.
  • May 2023 Study: YouTube was found to be monetizing videos that included misinformation about climate change.
  • 2019 BBC Investigation: YouTube’s algorithm promoted health misinformation including fake cancer cures, in searches conducted in ten different languages.
  • Brazil: YouTube has been linked to spreading pseudoscientific health misinformation and promoting far-right fringe discourse and conspiracy theories.
  • Philippines: Numerous channels spread misinformation related to the 2022 Philippine elections.
  • Flat Earth Beliefs: Research showed that YouTube channels create an echo chamber that polarizes audiences by confirming preexisting beliefs.

Use Among White Supremacists

Before 2019, YouTube removed specific videos or channels related to supremacist content that violated its policies but did not have site-wide policies against hate speech.

After the March 2019 Christchurch mosque attacks, YouTube, Facebook and Twitter were criticized for not doing enough to moderate and control hate speech which was considered a factor in the attacks. These platforms were pressured to remove such content. YouTube’s chief product officer Neal Mohan explained that while videos like those from ISIS follow a specific format that can be detected by algorithms, general hate speech is harder to identify and remove without human intervention.

In May 2019, YouTube joined an initiative led by France and New Zealand, along with other countries and tech companies, to create tools and regulations to block online hate speech. These regulations would be enforced at the national level, with penalties for tech firms that failed to remove such content. The United States did not participate in this initiative.

On June 5, 2019, YouTube updated its terms of service to specifically ban videos that claim one group is superior to justify discrimination, segregation, or exclusion based on age, gender, race, caste, religion, sexual orientation or veteran status. This includes videos promoting Nazi ideology. YouTube also announced it would remove content denying well-documented violent events such as the Holocaust or the Sandy Hook Elementary shooting.

In October 2019, YouTube banned the main channel of Red Ice, a white supremacist multimedia company, for hate speech violations. The channel had about 330,000 subscribers. Red Ice and Lana Lokteff promoted a backup channel to bypass the ban but YouTube removed the backup channel a week later.

In June 2020, YouTube faced criticism for allowing white supremacist content while pledging $1 million to fight racial injustice. Later that month, YouTube banned several channels associated with white supremacy including those of Stefan Molyneux, David Duke and Richard B. Spencer, for violating hate speech policies. The ban coincided with Reddit’s ban on several hate speech sub-forums, including r/The_Donald.

Handling of COVID-19 Pandemic & Other Misinformation

During the COVID-19 pandemic, YouTube removed videos linking 5G technology to the spread of the virus after such misinformation led to attacks on 5G towers in the UK.

In September 2021, YouTube expanded its policy to include videos spreading misinformation about any vaccines including those for measles and Hepatitis B, approved by local health authorities or the World Health Organization. The platform removed accounts of anti-vaccine campaigners like Robert F. Kennedy Jr. and Joseph Mercola.

YouTube also addressed non-medical misinformation. After the 2020 U.S. presidential election, it implemented policies to remove or label videos promoting election fraud claims. However, in June 2023, YouTube reversed this policy to allow for the open debate of political ideas, even those based on disproven assumptions.

In October 2021, Google and YouTube implemented policies to deny monetization to advertisers and content creators promoting climate change denial. This includes content calling climate change a hoax, denying global warming trends or denying human contributions to climate change.

In January 2024, the Center for Countering Digital Hate reported that climate change deniers were spreading other forms of misinformation not yet banned by YouTube, such as claims that global warming is “beneficial or harmless,” which undermine climate science.

In July 2022, YouTube announced policies to combat misinformation about abortion. This includes removing videos with unsafe abortion methods and misinformation about abortion safety.

Child Safety & Wellbeing

In 2017, there was a significant rise in videos related to children, partly due to the popularity of family vlogging and content creators shifting to family-friendly material. YouTube reported a 90% increase in time spent watching family vloggers.

However, this increase led to controversies about child safety. In Q2 2017, the owners of the popular channel FamilyOFive, known for “prank” videos involving their children, were accused of child abuse. Their videos were deleted and two of their children were removed from their custody.

A similar case occurred in 2019 when the owner of the channel Fantastic Adventures was accused of abusing her adopted children. Her videos were also deleted.

In 2017, YouTube faced criticism for showing inappropriate videos targeted at children, often featuring popular characters in violent, sexual or disturbing situations. Many of these videos appeared on YouTube Kids and garnered millions of views. This controversy was termed “Elsagate.”

On November 11, 2017, YouTube announced it was strengthening security to protect children from unsuitable content. Later that month, the company began mass deleting videos and channels that misused family-friendly characters. They also targeted channels showing children in inappropriate or dangerous activities under adult supervision. One significant removal was the channel Toy Freaks which had over 8.5 million subscribers and featured a father and his two daughters in unsettling situations. According to SocialBlade, Toy Freaks earned up to $11.2 million annually before its deletion in November 2017.

YouTube allows anonymous uploads, raising concerns even for seemingly child-friendly content. Some channels start with appropriate content but later include inappropriate material disguised for children. Additionally, many popular children’s channels have no identifiable owners, creating concerns about their intent.

One such channel, “Cocomelon,” produced numerous animated videos for children. By 2019, it was earning up to $10 million a month in ad revenue and was one of the largest kid-friendly channels on YouTube. The ownership of Cocomelon was unclear beyond its ties to “Treasure Studio,” an unknown entity, raising questions about its purpose. However, in February 2020, Bloomberg News confirmed and interviewed the small team of American owners, who said their goal was to entertain children and avoid attention from outside investors.

The anonymity of these channels raises concerns about their intent and accountability. Critics like Josh Golin from the Campaign for a Commercial-Free Childhood and educational consultant Renée Chernow-O’Leary argue that these videos are designed to entertain without educational value, making parents worry about their children’s screen time. Genuine content creators find it difficult to compete with larger channels that can produce content faster and benefit from YouTube’s recommendation algorithms.

In January 2019, YouTube officially banned videos that include challenges encouraging acts with a high risk of severe physical harm, like the Tide Pod Challenge. They also banned videos featuring pranks that make victims believe they are in physical danger or cause emotional distress in children.

Sexualization of Children & Pedophilia

In November 2017, it was revealed that many videos featuring children—often uploaded by the children themselves—were attracting comments from pedophiles. These videos, showing innocent activities like playing with toys or performing gymnastics, were found through private YouTube playlists or specific keywords in Russian. Some of these videos also ended up on the dark web and forums used by pedophiles.

This controversy, along with concerns about “Elsagate,” led several major advertisers to freeze their spending on YouTube. In December 2018, The Times discovered over 100 grooming cases in which children were manipulated into sexually explicit behavior by strangers. After a reporter flagged these videos, YouTube removed half of them immediately and the rest after The Times contacted YouTube’s PR department.

In February 2019, YouTube vlogger Matt Watson discovered a “wormhole” in YouTube’s recommendation algorithm that directed users to videos featuring children, often attracting comments from sexual predators. These comments included timestamps highlighting compromising positions and indecent remarks. Some users re-uploaded these videos in unlisted form, monetizing them and spreading this network.

Following this controversy, YouTube deleted over 400 channels and tens of millions of comments. They also reported offending users to law enforcement and the National Center for Missing and Exploited Children. A YouTube spokesperson stated that any content endangering minors is abhorrent and against their policies and they are working to improve their systems to catch abuse more quickly.

Despite these actions, companies like AT&T, Disney, Dr. Oetker, Epic Games and Nestlé pulled their advertising from YouTube.

In response to predatory comments, YouTube began demonetizing and blocking ads on videos attracting such comments. This was a temporary measure while exploring other solutions.

YouTube also started flagging channels that predominantly feature children and preemptively disabling their comment sections. “Trusted partners” can request to re-enable comments but must then moderate them. This mainly targets videos of toddlers but videos of older children and teenagers may also be protected, especially if they involve actions like gymnastics that can be interpreted as sexual.

YouTube is working on a better system to remove comments on other channels that resemble those made by child predators.

YouTube attempted to flag videos containing the abbreviation “CP” (for child pornography), but this led to false positives. Videos related to the game Pokémon Go, which uses “CP” for “Combat Power” and Club Penguin were mistakenly flagged. YouTube apologized and reinstated the affected videos.
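This kind of false positive follows naturally from over-broad token matching, as a minimal sketch shows. This is illustrative only, not YouTube’s actual classifier, and the video titles are hypothetical.

```python
def naively_flagged(title: str) -> bool:
    """Flag a title if it contains the token "CP".

    Illustrative only: this is exactly the over-broad matching that
    produces false positives, since "CP" also means "Combat Power"
    in Pokemon Go and abbreviates "Club Penguin".
    """
    return "CP" in title.upper().split()

print(naively_flagged("Pokemon Go CP calculator"))  # True (false positive)
print(naively_flagged("CP Rewritten gameplay"))     # True (false positive)
print(naively_flagged("Minecraft house tutorial"))  # False
```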

Additionally, online trolls have tried to get videos taken down by leaving comments similar to those made by child predators, particularly during the PewDiePie vs T-Series rivalry in early 2019. YouTube stated they only take action on videos that are likely to attract child predator activity, not just because of these comments.

In June 2019, The New York Times reported that users who watched erotic videos could be recommended seemingly harmless videos of children. In response, Senator Josh Hawley announced plans to introduce federal legislation to ban YouTube and other video-sharing sites from recommending videos that predominantly feature minors, except for professionally produced content like televised talent shows.

YouTube suggested potential plans to move all videos featuring children to the YouTube Kids site where they would have stricter control over recommendations. They are also considering major changes to the recommendation and auto-play systems on the main YouTube site.

Misogyny

In August 2022, the Center for Countering Digital Hate reported that harassment against women was thriving on YouTube. The report highlighted that channels promoting ideologies similar to men’s rights influencer Andrew Tate, who is banned from YouTube, were using the platform to grow their audience.

In his 2022 book “Like, Comment, Subscribe: Inside YouTube’s Chaotic Rise to World Domination,” Bloomberg reporter Mark Bergen noted that many female content creators face harassment, bullying and stalking on the platform.

Russia

In 2021, YouTube removed two accounts linked to RT Deutsch, the German channel of Russia’s RT network, for violating COVID-19 policies. Russia threatened to ban YouTube after these deletions.

After Russia invaded Ukraine in 2022, YouTube removed all channels funded by the Russian state and expanded this to include ‘pro-Russian’ channels. In June 2022, YouTube deleted the War Gonzo channel run by Russian military blogger Semyon Pegov. In July 2023, YouTube removed the channel of British journalist Graham Phillips, who had covered the War in Donbas since 2014.

In August 2023, a Moscow court fined Google 3 million rubles (about $35,000) for not deleting what it called “fake news about the war in Ukraine.”

April Fools Gags

YouTube featured April Fools pranks every year from 2008 to 2016:

  • 2008: All video links on the main page redirected to Rick Astley’s “Never Gonna Give You Up,” a prank known as “rickrolling.”
  • 2009: Clicking on a video turned the entire page upside down, which YouTube claimed was a “new layout.”
  • 2010: YouTube introduced “TEXTp” mode, rendering videos into ASCII art letters, joking it was to reduce bandwidth costs by $1 per second.
  • 2011: The site celebrated its “100th anniversary” with sepia-toned silent films in an early 1900s style including a parody of Keyboard Cat.
  • 2012: Clicking on a DVD image next to the site logo led to a video about ordering every YouTube video for home delivery on DVD.
  • 2013: YouTube teamed up with The Onion, claiming the site was a contest that had ended and would shut down for ten years. The video, featuring YouTube celebrities like Antoine Dodson, said only the winning video would be shown when YouTube returned in 2023. A 12-hour live stream announced the nominated videos.
  • 2014: YouTube claimed it created all viral video trends and previewed upcoming trends like “Clocking,” “Kissing Dad” and “Glub Glub Water Dance.”
  • 2015: YouTube added a music button to the video bar that played samples from “Sandstorm” by Darude.
  • 2016: YouTube introduced an option to watch every video in 360-degree mode with Snoop Dogg.

Services

YouTube Premium

YouTube Premium (formerly YouTube Red) is a subscription service offering ad-free streaming, original programming and background and offline video playback on mobile devices. It was initially announced on November 12, 2014, as “Music Key,” a subscription music streaming service intended to replace Google Play Music “All Access.”

On October 28, 2015, the service was relaunched as YouTube Red, providing ad-free streaming of all videos and exclusive original content. By November 2016, the service had 1.5 million subscribers, with an additional million on free trials. As of June 2017, the first season of YouTube Originals had received 250 million views.

YouTube TV

On February 28, 2017, YouTube announced YouTube TV, a subscription service for U.S. customers, priced at $35 per month at launch. Launched on April 5, 2017, in five major markets (New York City, Los Angeles, Chicago, Philadelphia and San Francisco), the service offers live streams of programming from major broadcast networks (ABC, CBS, The CW, Fox and NBC) and about 60 cable channels. These channels include Bravo, USA Network, Syfy, Disney Channel, CNN, Cartoon Network, E!, Fox Sports 1, Freeform, FX and ESPN, owned by companies like The Walt Disney Company, Paramount Global, Fox Corporation, NBCUniversal, Allen Media Group and Warner Bros. Discovery.

Subscribers can add premium cable channels like HBO (via a combined Max add-on), Cinemax, Showtime, Starz and MGM+, as well as other services like NFL Sunday Ticket, MLB.tv, NBA League Pass, Curiosity Stream and Fox Nation for an extra fee. They also have access to YouTube Premium original content.

In September 2022, YouTube TV started allowing customers to purchase most premium add-ons (except certain services like NBA League Pass and AMC+) without needing a base subscription.

YouTube Movies & TV

YouTube Movies & TV is a video-on-demand service offering movies and TV shows for purchase or rental. It also features a selection of free movies with ad breaks. In November 2018, YouTube began offering free movies, with selections changing monthly without announcement.

In March 2021, Google announced plans to phase out the Google Play Movies & TV app and migrate users to the YouTube app’s Movies & TV store. This change began on July 15 for Roku, Samsung, LG and Vizio smart TV users. Google Play Movies & TV officially shut down on January 17, 2024, with its web version migrating to YouTube, expanding the Movies & TV store for desktop users. Other functions were integrated into the Google TV service.

YouTube Primetime Channels

On November 1, 2022, YouTube launched Primetime Channels, a platform offering third-party subscription streaming add-ons sold individually through the YouTube website and app. This service competes with similar platforms by Apple, Prime Video and Roku. Add-ons can be purchased via the YouTube Movies & TV hub or the official YouTube channels of the services. YouTube TV subscribers can access these add-ons through the YouTube app and website. Initially, 34 streaming services, including Paramount+, Showtime, Starz, Epix, AMC+ and ViX+, were available.

NFL Sunday Ticket was added as a standalone add-on on August 16, 2023, as part of a broader deal with Google. The ad-free tier of Max was added on December 12, 2023 and YouTube TV combined its separate HBO and HBO Max add-ons into a single Max offering.

YouTube Music

On September 28, 2016, YouTube appointed Lyor Cohen, co-founder of 300 Entertainment and former Warner Music Group executive, as the Global Head of Music.

In early 2018, Cohen hinted at the launch of a new subscription music streaming service to compete with Spotify and Apple Music. On May 22, 2018, YouTube Music was launched.

YouTube Kids

YouTube Kids is a children’s video app developed by YouTube, a subsidiary of Google, created in response to parental and government concerns about content available to children. The app offers a child-friendly experience with curated content, parental control features and content filtering by age group (under 13, under 8, or under 5).

Launched on February 15, 2015, for Android and iOS, the app is now also available on LG, Samsung and Sony smart TVs, Android TV and Apple TV (since May 27, 2020). As of September 2019, YouTube Kids was available in 69 countries, including Hong Kong and Macau, and one province. A web-based version launched on August 30, 2019.

YouTube Go

In September 2016, YouTube Go was announced as an Android app designed for easier access to YouTube on mobile devices in emerging markets. It allowed users to download and share videos, preview them, share downloaded videos via Bluetooth and offered better control over mobile data usage and video resolution.

YouTube Go launched in India in February 2017 and expanded to 14 other countries including Nigeria, Indonesia, Thailand, Malaysia, Vietnam, the Philippines, Kenya and South Africa, in November 2017. On February 1, 2018, it rolled out to 130 countries worldwide including Brazil, Mexico, Turkey and Iraq. Before shutting down, the app was available to around 60% of the world’s population. In May 2022, Google announced that YouTube Go would be shut down in August 2022.

YouTube Shorts

In September 2020, YouTube announced YouTube Shorts, a short-form video platform similar to TikTok, initially limited to 15-second clips. First tested in India, it expanded to other countries, including the United States, by March 2021, with the maximum length raised to 60 seconds. Shorts is integrated into the main YouTube app and offers built-in creative tools, including the ability to add licensed music to videos. The platform had its global beta launch in July 2021.

YouTube Stories

In 2018, YouTube began testing a feature called “YouTube Reels,” later renamed “YouTube Stories.” Similar to Instagram Stories and Snapchat Stories, the feature was available only to creators with more than 10,000 subscribers and could be used only in the YouTube mobile app. On May 25, 2023, YouTube announced that it would shut down the feature on June 26, 2023.

YouTube VR

In November 2016, YouTube released YouTube VR, a version designed for VR devices, starting with Google’s Daydream platform on Android. In November 2018, it became available on the Oculus Store for the Oculus Go headset. YouTube VR has since been updated for compatibility with successive Quest devices and ported to Pico 4.

YouTube VR provides access to all YouTube videos, with dedicated support for 360° and 180° videos in 2D and 3D. It was updated with mixed-reality passthrough modes on VR headsets, starting with the Oculus Quest. In April 2024, it gained support for 8K SDR video on the Meta Quest 3.

Social Impact

Both individuals and large production companies have used YouTube to grow their audiences. Independent creators have built large followings with little cost, while traditional media celebrities joined YouTube to reach potentially larger audiences than television.

YouTube’s “Partner Program” allows video producers to earn significant income, with the top 500 partners each earning over $100,000 annually and the top ten channels earning between $2.5 million and $12 million. However, YouTube has also been seen as a promotional platform for music labels.

In 2013, Forbes’ Katheryn Thayer noted that digital-era artists need to create high-quality content that elicits reactions on YouTube and social media. That year, 2.5% of artists categorized as “mega,” “mainstream” and “mid-sized” received 90.3% of views on YouTube and Vevo.

By early 2013, Billboard began including YouTube streaming data in its Hot 100 and related genre charts calculations.

TED curator Chris Anderson highlighted the impact of YouTube on communication, comparing its significance to that of the Gutenberg press for writing. He suggested that online video could greatly accelerate scientific progress and create the biggest learning cycle in history.

In education, the Khan Academy started as YouTube video tutoring sessions and has grown into a massive educational platform, described by Forbes’ Michael Noer as “the largest school in the world.” This shows the potential of technology to disrupt traditional learning methods.

YouTube received a 2008 George Foster Peabody Award and was praised as a platform that promotes democracy. Unlike mainstream television, YouTube’s most popular channels feature a diverse range of minorities. A Pew Research Center study noted the rise of “visual journalism,” where citizen eyewitnesses and established news organizations share content, making YouTube an important news source.

YouTube has allowed people to engage directly with government, as in the 2007 CNN/YouTube presidential debates, where ordinary people submitted questions to U.S. presidential candidates via video, demonstrating how internet video is changing the political landscape.

During the Arab Spring (2010-2012), activists used social media for organizing protests: Facebook to schedule, Twitter to coordinate and YouTube to share their message with the world.

In 2012, the “Kony 2012” video posted on YouTube led to significant political action. More than a third of the U.S. Senate introduced a resolution condemning Joseph Kony just 16 days after the video was posted. Senator Lindsey Graham noted that the video did more to lead to Kony’s downfall than all other actions combined.

YouTube has also helped governments engage with citizens. In 2012, the White House’s official YouTube channel was the seventh top news producer on the platform. In 2013, a healthcare exchange commissioned a YouTube music video spoof by Obama impersonator Iman Crosson to encourage young Americans to enroll in Affordable Care Act health insurance.

In February 2014, President Obama met with leading YouTube content creators at the White House to promote Obamacare and find ways for the government to connect with the “YouTube Generation.” While YouTube allows presidents to directly connect with citizens, the media savvy of YouTube creators was seen as essential to engaging the platform’s audience effectively.

Some YouTube videos have directly impacted world events. For example, the “Innocence of Muslims” video in 2012 led to international protests and anti-American violence.

TED curator Chris Anderson noted that individuals sharing their skills on YouTube challenge others to improve, driving innovation and evolution in various fields. Journalist Virginia Heffernan from The New York Times stated that such videos have significant implications for cultural dissemination and the future of classical music.

A 2017 article in The New York Times Magazine suggested that YouTube had become “the new talk radio” for the far right. Almost a year before YouTube’s January 2019 announcement to reduce recommendations of borderline content and misinformation, Zeynep Tufekci wrote that YouTube could be one of the most powerful radicalizing instruments of the 21st century.

After YouTube changed its recommendation system, the most recommended channel shifted from conspiracy theorist Alex Jones in 2016 to Fox News in 2019.

A 2020 study suggested that while journalists often blame YouTube’s recommendation engine for political radicalization, this view might be premature. Instead, the study proposed a “Supply and Demand” framework for analyzing YouTube politics.

A 2022 study found little systematic evidence to support the idea that YouTube’s algorithms lead people to extremist content. It noted that exposure to such content is concentrated among a small group with high levels of gender and racial resentment. Contrary to the “rabbit holes” narrative, non-subscribers are rarely recommended extremist videos and seldom follow these recommendations.

The Legion of Extraordinary Dancers and the YouTube Symphony Orchestra selected members based on individual video performances. The charity video “We Are the World 25 for Haiti (YouTube edition)” combined performances from 57 singers worldwide into one song. The Tokyo Times highlighted the “We Pray for You” video as an example of using crowdsourcing for charity.

The anti-bullying It Gets Better Project started with a single video aimed at LGBT teens and quickly grew, drawing responses from figures like President Obama and Vice President Biden. Amanda Todd’s video on bullying and suicide led to legislative action to study and address bullying after her death.

In May 2018, YouTube deleted 30 drill music videos after London Metropolitan Police claimed they promoted gang violence.

Finances

Before 2020, Google did not provide detailed figures for YouTube’s costs, and in 2007 YouTube’s revenues were described as “not material.” A 2008 Forbes article projected YouTube’s revenue for that year at $200 million, noting progress in ad sales.

In 2012, YouTube’s ad revenue was estimated at $3.7 billion. By 2013, eMarketer estimated it had grown to $5.6 billion, though other estimates put it closer to $4.7 billion. Most YouTube videos are free to view and supported by ads.

In May 2013, YouTube piloted a slate of 53 paid subscription channels, priced from $0.99 to $6.99 a month, aiming to compete with Netflix, Amazon Prime and Hulu.

Google first published YouTube’s exact revenue numbers in February 2020, as part of Alphabet’s 2019 financial report. In 2019, YouTube made $15.1 billion in ad revenue, up from $8.1 billion in 2017 and $11.1 billion in 2018. YouTube’s revenue accounted for nearly 10% of Alphabet’s total revenue in 2019. This included approximately 20 million subscribers to YouTube Premium and YouTube Music and 2 million subscribers to YouTube TV.

In 2022, YouTube’s ad revenue reached $29.2 billion, an increase of $398 million from the previous year.

Partnership with Corporations

In June 2006, YouTube partnered with NBC for marketing and advertising. In March 2007, it struck a deal with the BBC for three channels featuring BBC content: one for news and two for entertainment.

In November 2008, YouTube reached agreements with MGM, Lions Gate Entertainment and CBS to post full-length films and TV episodes on the site, with ads, in a section called “Shows” for U.S. viewers. This move aimed to compete with Hulu which features content from NBC, Fox and Disney.

In November 2009, YouTube launched a version of “Shows” for UK viewers, offering around 4,000 full-length shows from more than 60 partners. In January 2010, YouTube introduced an online film rental service available in the U.S., Canada and the UK offering over 6,000 films.

2017 Advertiser Boycott

In March 2017, the UK government stopped advertising on YouTube after discovering its ads appeared on extremist content. They demanded assurances that ads would be placed safely. The Guardian and other major British and U.S. brands also suspended their YouTube ads for the same reason. Google responded by stating it would review its advertising policies and make changes to give brands more control over ad placement.

In early April 2017, the YouTube channel h3h3Productions claimed that a Wall Street Journal article had fabricated screenshots showing major brand ads on an offensive video. They argued that the video hadn’t earned any ad revenue and it was later found that the ads were triggered by the use of copyrighted content in the video.

On April 6, 2017, YouTube announced changes to ensure that revenue only goes to creators who follow the rules. They required channels to undergo a policy compliance review and have at least 10,000 lifetime views before they could join the Partner Program.

YouTuber Earnings

In May 2007, YouTube launched its Partner Program (YPP), which is based on AdSense and allows video uploaders to share in the revenue from advertising on the site. YouTube takes 45% of the advertising revenue, while 55% goes to the uploader.

There are over two million members in the YouTube Partner Program. In 2013, TubeMogul reported that a pre-roll ad on YouTube (an ad shown before the video starts) cost advertisers an average of $7.60 per 1,000 views. Usually, fewer than half of eligible videos have a pre-roll ad because of a lack of interested advertisers.
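
The arithmetic behind these figures can be sketched in a few lines. This is a hypothetical illustration only, not YouTube’s actual accounting; `uploader_revenue` and its defaults are assumptions built from the 55/45 split and the 2013 average pre-roll rate cited above:

```python
def uploader_revenue(views: int, cpm: float = 7.60, uploader_share: float = 0.55) -> float:
    """Estimate an uploader's ad earnings (hypothetical sketch).

    Assumes every view is served a pre-roll ad billed at `cpm` dollars
    per 1,000 views and the Partner Program's 55/45 revenue split.
    Real earnings vary widely with ad fill rates and formats.
    """
    gross_revenue = views / 1000 * cpm
    return gross_revenue * uploader_share

# 1 million ad-served views at the 2013 average pre-roll rate:
print(round(uploader_revenue(1_000_000), 2))  # 4180.0
```

Since fewer than half of eligible videos actually carry a pre-roll ad, effective per-view earnings would be substantially lower than this upper bound.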

YouTube’s policies restrict certain content from being monetized with advertising. This includes videos with violence, strong language, sexual content and “controversial or sensitive subjects and events” like war, political conflicts, natural disasters and tragedies, even if they don’t show graphic images. However, content that is newsworthy or comedic with the intent to inform or entertain may be exceptions. Additionally, videos with inappropriate user comments can also be demonetized.

In 2013, YouTube allowed channels with at least 1,000 subscribers to require a paid subscription for viewers to watch their videos. In April 2017, they set a requirement of 10,000 lifetime views for paid subscriptions. On January 16, 2018, the eligibility for monetization was changed to 4,000 hours of watch-time in the past 12 months and 1,000 subscribers. This move aimed to ensure monetized videos did not cause controversy but was criticized for penalizing smaller channels.

Play Buttons & Monetization Policies

YouTube Play Buttons, part of the YouTube Creator Awards, recognize the platform’s most popular channels. The awards are given at subscriber milestones:

  • Silver Play Button: 100,000 subscribers (nickel-plated copper-nickel alloy)
  • Gold Play Button: 1 million subscribers (gold-plated brass)
  • Diamond Play Button: 10 million subscribers (silver-plated metal)
  • Custom Play Button: 50 million subscribers (ruby and red-tinted crystal glass)
  • Red Diamond Play Button: 100 million subscribers (red-tinted crystal glass)
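
The milestones above map directly to award tiers. A hypothetical helper (names and thresholds taken from the list above; the function itself is illustrative, not an official API) might look like:

```python
from typing import Optional

# Milestones in descending order, from the Creator Awards list above.
MILESTONES = [
    (100_000_000, "Red Diamond Play Button"),
    (50_000_000, "Custom Play Button"),
    (10_000_000, "Diamond Play Button"),
    (1_000_000, "Gold Play Button"),
    (100_000, "Silver Play Button"),
]

def play_button(subscribers: int) -> Optional[str]:
    """Return the highest Play Button a channel qualifies for, if any."""
    for threshold, award in MILESTONES:
        if subscribers >= threshold:
            return award
    return None

print(play_button(2_500_000))  # Gold Play Button
```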

Enforcement of these monetization restrictions has been contentious. In September 2016, YouTube introduced an enhanced notification system for policy violations, which creators such as Philip DeFranco and the Vlogbrothers criticized, arguing that demonetization was a form of censorship. YouTube clarified that the policy was not new but that it had improved the notification and appeal process. In 2019, it was reported that videos with LGBT keywords were often demonetized.

As of November 2020 in the United States and June 2021 worldwide, YouTube can monetize any video on the platform, even if the uploader is not part of the YouTube Partner Program. If the content is deemed “advertiser-friendly,” all ad revenue will go directly to Google, with none shared with the uploader.

Revenue to Copyright Holders

Most of YouTube’s advertising revenue goes to publishers and video producers who hold the rights to their videos, while YouTube keeps 45%. In 2010, it was found that nearly a third of videos with ads were uploaded without the copyright holders’ permission. YouTube allows copyright holders to either remove their videos or let them run and earn revenue.

In May 2013, Nintendo began enforcing its copyright and claiming ad revenue from creators who posted gameplay content. In February 2015, Nintendo agreed to share this revenue through the Nintendo Creators Program. The program was discontinued on March 20, 2019.

Censorship & Bans

YouTube has been censored, filtered or banned for various reasons including:

  • Preventing Social or Political Unrest: Limiting access to content that may cause unrest.
  • Preventing Criticism: Blocking criticism of rulers (e.g., North Korea), governments (e.g., China), officials (e.g., Turkey and Libya) or religions (e.g., Pakistan).
  • Morality-Based Laws: Bans based on moral grounds such as in Iran.

Specific videos may be blocked due to copyright and intellectual property laws (e.g., in Germany), hate speech violations or being deemed inappropriate for youth. YouTube uses the YouTube Kids app and “restricted mode” to manage such content.

Businesses, schools, government agencies and other institutions often block YouTube to save bandwidth and prevent distractions.

As of 2018, YouTube is blocked in several countries including China, North Korea, Iran, Turkmenistan, Uzbekistan, Tajikistan, Eritrea, Sudan and South Sudan. In some countries, access to YouTube is blocked temporarily during periods of unrest, before elections or around political anniversaries. If YouTube is banned due to a specific video, the platform often agrees to remove or restrict access to that video to restore service.

Since October 2019, reports have emerged that comments posted with Chinese characters insulting the Chinese Communist Party (such as “communist bandit” or “50 Cent Party” referring to state-sponsored commentators) are automatically deleted within 15 seconds.

Specific Incidents of Being Blocked

  • Thailand blocked YouTube in April 2007 because of a video that was said to insult the Thai king.
  • Morocco blocked YouTube in May 2007, possibly due to videos criticizing Morocco’s occupation of Western Sahara. The site became accessible again on May 30, 2007, after Maroc Telecom unofficially claimed the block was just a “technical glitch.”
  • Turkey blocked YouTube from 2008 to 2010 due to videos considered insulting to Mustafa Kemal Atatürk. In November 2010, the site was briefly blocked again over a video featuring politician Deniz Baykal and was threatened with another shutdown if the video wasn’t removed. Despite the ban, YouTube remained the eighth-most-visited site in Turkey during this time. In 2014, Turkey blocked YouTube again due to a high-level intelligence leak.
  • Pakistan blocked YouTube on February 23, 2008, because of “offensive material” towards Islam. This caused a nearly global blackout of YouTube for about two hours as the block affected other countries. The ban was lifted on February 26, 2008, after YouTube removed the objectionable content. Many Pakistanis used VPNs to bypass the three-day block. In May 2010, Pakistan blocked YouTube again due to “growing sacrilegious content.” The ban was lifted on May 27, 2010, but individual videos offensive to Muslims could still be blocked. In September 2012, Pakistan blocked YouTube again after the site refused to remove the film “Innocence of Muslims.” The ban was lifted in January 2016 when YouTube launched a Pakistan-specific version.
  • Libya blocked YouTube on January 24, 2010, due to videos showing protests in Benghazi by families of detainees killed in Abu Salim prison in 1996 and videos of Libyan leader Muammar Gaddafi’s family at parties. Human Rights Watch criticized the block. After the Libyan Civil War, YouTube was unblocked in November 2011.
  • In September 2012, Afghanistan, Bangladesh, Pakistan and Sudan blocked YouTube due to a controversial 14-minute trailer for the film “Innocence of Muslims.” A court in Chechnya also ruled to ban the film. The video was blamed for violent protests in Libya and Egypt. YouTube said the video met their guidelines and stayed on the site but they temporarily restricted access in Libya and Egypt due to the unrest.
  • Following Russia’s invasion of Ukraine in February 2022, YouTube announced on March 1 that it would immediately remove RT and other Russian-government-funded outlets from its platform in Europe and this removal was soon expanded globally.
