I admire Wikipedia for trying to create a common knowledge database that everyone can access and contribute to, but ultimately the site learned what politicians and sociologists realized long ago: left to their own devices, people can't be trusted. Wikipedia announced it's adding a layer of editorial control to its content, stronger than the current practice of volunteers scanning pages on their own. Now changes to certain topic pages will need approval before they go live on the site. What do the changes say about Wikipedia's purpose, and its future? I always thought Wikipedia was community-created first, but it seems the site would rather be accurate. Wasn't that what encyclopedias used to be: created by a community of experts, but vetted for accuracy?
I wouldn't argue against Wikipedia's value, but its authoritativeness has certainly been questioned time and again. Most schools don't allow students to cite the site, and many journalistic style guides warn against trusting the content of its articles. While that doesn't affect the millions of people checking the site for updates on their favorite celebrities and TV shows, its creators are probably seeking more credibility with this move toward editorial control.
Will this be the fate of other social media sites? Open content to the masses, realize the masses include some unsavory characters, and slowly close the gates to control what gets in? My guess is, probably. As social media becomes more mainstream, the industry will grow more regulated and self-policing, forcing us to reexamine exactly what "open" and "social creation" mean on the web.