Learn how to play guitar through YouTube, via Soundslice
Tablature files (or “tabs” — a simplified guitar notation in plain text format) are aggregated by shady content farms with strong SEO and dubious quality control. YouTube videos provide audio and visual instruction, but require patience and the ability to “read the fingers” of the performer.
That’s why Soundslice is a revelation for self-taught musicians. Built on YouTube’s API, it’s a transcription interface that syncs tablature and videos so players get the best of both worlds. You can also play the video at half speed (without changing the pitch) and loop small sections if you’re trying to pin down a tricky riff. Everything runs in your web browser or on an iPad; there’s nothing to install.
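Soundslice’s own code isn’t public, but half-speed playback and section looping line up with what YouTube’s embedded-player JavaScript API (more on that below) exposes to any web developer. Here’s a minimal sketch of the idea; the element ID, video ID and loop boundaries are placeholders:

```typescript
// Sketch: half-speed playback and section looping with the YouTube IFrame Player API.
// Assumes the API script (https://www.youtube.com/iframe_api) is already loaded
// and an empty <div id="player"> exists on the page.
declare const YT: any; // global object provided by the IFrame API script

const LOOP_START = 42.0; // placeholder section boundaries, in seconds
const LOOP_END = 48.5;

const player: any = new YT.Player('player', {
  videoId: 'dQw4w9WgXcQ', // placeholder video ID
  events: {
    onReady: () => {
      player.setPlaybackRate(0.5); // half speed
      player.seekTo(LOOP_START, true);
      player.playVideo();
    },
  },
});

// Poll the playhead and jump back to the start of the section,
// so a tricky riff repeats until you pause.
setInterval(() => {
  if (player.getCurrentTime && player.getCurrentTime() >= LOOP_END) {
    player.seekTo(LOOP_START, true);
  }
}, 100);
```

In the HTML5 player, changing the playback rate typically leaves the pitch alone, which is why the slowed-down video still sounds in tune.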
While these tools are outstanding in their own right, the big promise here is in creating a rich trove of living, accurate guitar tutorials for everyone on the web to enjoy.
“My goal was to make something for myself, to make transcription less painful,” the site’s founder Adrian Holovaty tells Mashable. “I’d spend hours transcribing stuff, either on paper or in lousy text files, then I’d come back to it later and have to re-listen to the music to make sense of my own tab. I started to think, it would be so much easier to learn if the tab were synced with the original audio.”
Soundslice uses YouTube’s official HTML5 JavaScript API, which allows developers to control videos using their own interface. Users can work from any YouTube video, not just their own. The transcription editor UI is similar to multi-track recording software. Add a track for chords, tab notation, song structure (chorus, verse, bridge) and start plotting.
Drag a note to set its length on a string and add the fret number. The space bar starts and pauses the video. You quickly realize that Soundslice adds a temporal dimension to tablature without the need for time signatures or measures. If the community takes off, it could fundamentally change how the Internet thinks about, creates and shares this kind of notation.
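That temporal dimension is easy to picture as data: each note gets pinned to a point on the video’s timeline in seconds, not to a bar or a beat. Here’s a hypothetical sketch of such a model; the TabNote and Track types and the space-bar handler are illustrative guesses, not Soundslice’s actual format:

```typescript
// Hypothetical data model: tab notes pinned to video time instead of measures.
interface TabNote {
  stringIndex: number; // 0 = high E through 5 = low E on a standard guitar
  fret: number;
  startTime: number;   // seconds into the YouTube video
  duration: number;    // how long the note is held, set by dragging its length
}

interface Track {
  kind: 'tab' | 'chords' | 'structure'; // mirrors the multi-track editor
  label: string;
  notes: TabNote[];
}

const soloTrack: Track = {
  kind: 'tab',
  label: 'Guitar solo',
  notes: [
    { stringIndex: 1, fret: 7, startTime: 42.1, duration: 0.4 },
    { stringIndex: 2, fret: 9, startTime: 42.5, duration: 0.8 },
  ],
};

// Space bar toggles playback, as in the editor described above.
declare const player: any; // the YT.Player instance from the earlier sketch

document.addEventListener('keydown', (e) => {
  if (e.code === 'Space') {
    e.preventDefault();
    // getPlayerState() === 1 means "playing" in the IFrame API
    if (player.getPlayerState() === 1) {
      player.pauseVideo();
    } else {
      player.playVideo();
    }
  }
});
```

Because every note carries its own start time and duration, the notation needs no bar lines to stay in lockstep with the recording.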
I asked Holovaty about the potential for Soundslice to become a social network.
“It can become a commons for user-generated musical annotations and transcriptions,” he says. “At the moment, social interaction is very limited — you can see other people’s annotations and see all the other videos they’ve annotated and that’s it. But obviously, there’s a ton of potential to do more.”
Holovaty envisions the classic 20/80 split — 20% of users will create the content for the other 80%. “Originally I imagined it to be for relatively advanced musicians, but I’ve already seen some simpler stuff come through the system. Never underestimate the power of bored high school or college students who want to learn music!”
That work will also be connected to your Soundslice account. Savvy transcribers might sync their own videos to teach, thus generating views and ad revenue from YouTube’s partner program. There’s a lot of potential for power users.
Quality Control of UGC
Current tab repositories are a cluttered mess. A song might have 20 versions, each with its own errors or embellishments. Quality control of user-generated content can be a challenge, but Holovaty sees two potential modes.

“I’m considering both a revision-control model (like GitHub) or a Wikipedia model,” he explains. “What would make more sense for annotations: ‘branching’ changes where everybody owns their own data and accepts pull requests, or a more shared wiki-style thing where anybody can edit anything, with revertible history? I’ve been thinking about it for a long time and am still not sure.”
Business Model
The other challenge for UGC networks is monetization.
“I’m planning to add a pay-for version, where you can upload your own tracks as opposed to relying on YouTube,” Holovaty says of his future business model. “Plus, you’d get some niceties like a graphical waveform display, more fine-grained slowdown and an automated first-pass at the transcription (which would be imperfect but at least a starting point). I’d also like to talk to music education companies who might want to pay me to embed the interface into their own sites.”
Regarding that automated transcription, Holovaty is experimenting with software created by The Echo Nest, whose algorithms power music apps like Spotify and iHeartRadio.
Turns out, auto-generating sheet music from a complex recording is still a long way off. It’s a computational feat akin to sentient artificial intelligence, says Holovaty, who has been studying this kind of technology for some time.
“The Echo Nest has some nice APIs that make automated guesses at the underlying musical information in an audio recording,” he says. This means that a human could provide the framework for the transcription, and an algorithm could work from that to fill in the gaps, speeding up the process. Holovaty hopes to include this in future paid accounts.
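The Echo Nest’s real response format isn’t reproduced here, but the division of labor Holovaty describes could look roughly like this: a human marks where each note begins, and code makes a first guess at the pitch at that moment from the automated analysis. A speculative sketch, assuming an analysis payload of timed segments carrying 12-bin chroma (“pitches”) vectors:

```typescript
// Speculative sketch of the human-plus-algorithm workflow described above.
// The AnalysisSegment shape is an assumption for illustration, not the
// actual Echo Nest API response.
interface AnalysisSegment {
  start: number;      // seconds
  duration: number;
  pitches: number[];  // 12 chroma bins, C through B, each 0..1
}

const PITCH_CLASSES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B'];

// Human-provided framework: the moments (in seconds) where notes begin.
const humanMarkedOnsets = [42.1, 42.5, 43.0];

function guessPitchAt(time: number, segments: AnalysisSegment[]): string | null {
  const segment = segments.find((s) => time >= s.start && time < s.start + s.duration);
  if (!segment) return null;
  // Take the strongest chroma bin as a first-pass guess; a human corrects it later.
  const strongest = segment.pitches.indexOf(Math.max(...segment.pitches));
  return PITCH_CLASSES[strongest];
}

declare const segments: AnalysisSegment[]; // output of an automated analysis
const draft = humanMarkedOnsets.map((t) => ({ time: t, pitch: guessPitchAt(t, segments) }));
console.log(draft);
```

Whatever the algorithm gets wrong, the transcriber fixes by hand, which is still far quicker than starting from nothing.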
For now, he and his designer PJ Macklin (the only two people working on Soundslice at the moment) are looking forward to seeing what users create, and adding features based on feedback. “Next up is the long list of feature additions and improvements. And of course, paying the bills and getting people to use it!”
Thumbnail image courtesy of Dylan Adams, Flickr.