

The music industry uses your social data to predict its next big artists



Fifteen years ago, Steve Jobs introduced the iPod. Since then, most music fans have come to understand how radically it changed the way they listen to music.

Less understood are the ways that raw information – accumulated via downloads, apps and online searches – is influencing not only what songs are marketed and sold, but which songs become hits.

Decisions about how to market and sell music, to some extent, still hinge upon subjective assumptions about what sounds good to an executive, or which artists might be easier to market. Increasingly, however, businesses are turning to big data and the analytics that can help turn this information into actions.

Big data is a term that reflects the amount of information people generate – and it’s a lot. Some estimate that humans now generate more information in a single minute than they did in all of recorded history through the year 2000.

Unsurprisingly, harnessing this data has shaped the music industry in radical new ways.

When it was all about the charts

In the 20th century, decisions about how to market and sell music were based upon assumptions about who would buy it or how they would hear it.

At times, purely subjective assumptions would guide major decisions. Some producers, like Phil Spector and Don Kirshner, earned reputations for their “golden ears” – their ability to intuit what people would want to listen to before they heard it. (If you aren’t aware of the SNL parody of this phenomenon, take a second to see “More Cowbell.”) Eventually, record companies incorporated more market-based objective information through focus groups, along with sheet music and record sales.

But the gold standard of information in the music industry became the “charts,” which track the comparative success of one recording against others.

Music charts have typically combined two pieces of information: what people are listening to (radio, jukeboxes and, today, streaming) and what records they’re buying.

Charts like the Billboard Hot 100 measure the exposure of a recording. If a song is in the first position on a list of pop songs, the presumption is that it’s the most popular – the most-played song on the radio, or the most-purchased in record stores. In the 1920s through the 1950s, when record charts began to appear in Billboard, they were compiled from sales information provided by select shops where records were sold. The number of times a recording played on the radio began to be incorporated into the charts in the 1950s.

While charts attempt to be objective, they don’t always capture musical tastes and listening habits. For example, in the 1950s, artists started appearing on multiple charts presumed to be distinct. When Chuck Berry made a recording of “Maybellene” that simultaneously appeared in the country and western, rhythm and blues, and pop charts, it upended certain assumptions that undergirded the music industry – specifically, that the marketplace was as segregated as the United States. Simply put, the industry assumed that pop and country were Caucasian, while R&B was African-American. Recordings like “Maybellene” and other “crossover” hits signaled that subjective tastes weren’t being accurately measured.

In the 1990s, chart information incorporated better data, with charts automatically being tracked via scans at record stores. Once sales data began to be accumulated across all stores using Nielsen Soundscan, some larger assumptions about what people were listening to were challenged. The best-selling recordings in the early 1990s were often country and hip-hop records, even though America’s radio stations during the 1980s had tended to privilege classic rock.

Record charts are constantly evolving. Billboard magazine has the longest-running series of charts evaluating different genres and styles of music, and so it makes a good standard for comparison. Yet new technology has made this system a bit problematic. For example, data generated from Pandora weren’t added to the Billboard charts until January of this year.

The end of genre?

Today, companies are trying to make decisions relying on as few assumptions as possible. Whereas in the past the industry relied primarily on sales figures and how often songs were played on the radio, companies can now see which specific songs people are listening to, where they are hearing them and how they are consuming them.

On a daily basis, people generate 2.5 exabytes of data – the equivalent of 250,000 times all of the books in the Library of Congress. Obviously, not all of this data is useful to the music industry. But analytical software can use some of it to help the music industry understand the market.

The Music Genome Project, the algorithm behind Pandora, sifts through 450 pieces of information about the sound of a recording. For example, a song might feature the drums as one of the loudest components of the sound compared with other features of the recording. That measurement is a piece of data that can be incorporated into the larger model. Pandora uses these data to help listeners find music that sounds similar to what they have enjoyed in the past.
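Pandora has not published its scoring in code form, so the following is only a toy sketch (the attribute names and numbers are invented): matching by sound can be modeled as cosine similarity between per-song feature vectors.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy three-attribute vectors (say, drum prominence, tempo, vocal intensity),
# each scaled 0-1; the real model uses about 450 such attributes.
liked_song = [0.9, 0.7, 0.2]
candidate_a = [0.85, 0.75, 0.25]  # similar sound
candidate_b = [0.1, 0.2, 0.95]    # very different sound

print(cosine_similarity(liked_song, candidate_a))  # close to 1.0
print(cosine_similarity(liked_song, candidate_b))  # much lower
```

A real system would weight the attributes and compare against millions of recordings, but the core idea is the same: songs become points in a feature space, and nearby points sound alike.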

This approach upends the 20th-century assumptions of genre. For example, a genre such as classic rock can become monolithic and exclusionary. Subjective decisions about what is and isn’t “rock” have historically been sexist and racist.

With Pandora, the sound of a recording becomes much more influential. Genre is only one of 450 pieces of information that’s being used to classify a song, so if it sounds like 75 percent of rock songs, then it likely counts as rock.

Meanwhile, Shazam began as an idea for turning sound into data. The smartphone app takes an acoustic fingerprint of a song’s sound to reveal the artist, song title and album title of the recording. When users hold their phone toward a speaker playing a recording, they quickly learn what they are hearing.
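Shazam’s production fingerprinter operates on spectrogram peaks; the sketch below is a heavily simplified stand-in (the peak lists and song names are invented) showing the core trick of hashing pairs of peaks and counting hash overlaps against a database.

```python
from collections import Counter

def fingerprint(peaks):
    """Hash pairs of nearby spectral peaks as (freq1, freq2, time gap).

    `peaks` is a list of (time, frequency) tuples, a stand-in for the
    peak-picking step a real fingerprinter runs on a spectrogram.
    """
    hashes = set()
    for i, (t1, f1) in enumerate(peaks):
        for t2, f2 in peaks[i + 1:i + 4]:  # pair each peak with a few successors
            hashes.add((f1, f2, t2 - t1))
    return hashes

def best_match(sample_peaks, database):
    """Return the song whose fingerprint shares the most hashes with the sample."""
    sample = fingerprint(sample_peaks)
    scores = Counter({name: len(sample & fp) for name, fp in database.items()})
    return scores.most_common(1)[0][0]

# Hypothetical mini-database of pre-fingerprinted songs.
database = {
    "song_a": fingerprint([(0, 100), (1, 220), (2, 180), (3, 300)]),
    "song_b": fingerprint([(0, 440), (1, 440), (2, 520), (3, 610)]),
}

# A short clip whose peaks overlap with song_a's.
clip = [(0, 100), (1, 220), (2, 180)]
print(best_match(clip, database))  # song_a
```

Because each hash encodes relative timing rather than absolute position, a match works even when the clip starts mid-song or carries background noise that adds spurious peaks.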

The listening habits of Shazam’s 120 million active users can be viewed in real time, by geographic location. The music industry can now learn how many people, upon hearing a particular song, wanted to know its title and artist. That real-time data can shape decisions about how – and to whom – songs are marketed, based on listeners’ preferences. Derek Thompson, a journalist who has examined data’s effects on the music industry, has suggested that Shazam has shifted the power of deciding hits from the industry to the wisdom of a crowd.

The idea of converting a recording’s sound into data has also led to a different way of interpreting this information.

If we know the “sound” of past hits – the interaction between melody, rhythm, harmony, timbre and lyrics – is it possible to predict what the next big hit will be? Companies like Music Intelligence Solutions, Inc., with its software Uplaya, compare a new recording to older recordings to predict success. Researchers at the University of Antwerp in Belgium studied dance songs and built a model that predicted hits with roughly 70 percent accuracy.
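Neither Uplaya’s model nor the Antwerp study’s is available as code, so the following is only a minimal illustration of the underlying idea – score a new song by how close its audio features sit to past hits versus past flops – using a nearest-centroid classifier (all numbers and attribute names are invented):

```python
def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

def predict_hit(song, hits, flops):
    """Classify by squared distance to the centroid of past hits vs past flops."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    hit_c, flop_c = centroid(hits), centroid(flops)
    return sq_dist(song, hit_c) < sq_dist(song, flop_c)

# Invented two-attribute vectors (say, danceability and hook repetition), 0-1 scale.
past_hits = [[0.8, 0.9], [0.7, 0.85], [0.9, 0.8]]
past_flops = [[0.2, 0.3], [0.3, 0.1], [0.25, 0.2]]

print(predict_hit([0.75, 0.8], past_hits, past_flops))  # True
print(predict_hit([0.2, 0.25], past_hits, past_flops))  # False
```

Real hit-prediction systems use far richer features and models, but the question they answer is the same: does this recording land nearer the cluster of past successes or past failures?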

Of course, YouTube might tend to cluster songs by genre in its search algorithm, but it’s increasingly clear that the paradigms that have defined genres are less applicable now than ever before.

What happens next?

Even as new information becomes available, old models still help us organize that information. Billboard Magazine now has a Social 50 chart, which tracks the artists most actively mentioned on the world’s leading social media sites.

In a way, social media can be thought of as analogous to the small musical scenes of the 20th century, like New York City’s CBGB or Seattle’s Sub Pop scene. In Facebook groups or on Twitter lists, some dedicated and like-minded fans are talking about the music they enjoy – and record companies want to listen. They’re able to follow how the “next big thing” is being voraciously discussed within a growing and devoted circle of fans.

Streaming music services are increasingly focused upon how social media is intertwined with the listening experience. The Social 50 chart is derived from information gathered by the company Next Big Sound, which is now owned by Pandora. In 2014, Spotify acquired the music analytics firm The Echo Nest, and in 2015 Apple acquired Semetric.

Songwriters and distributors now know – more than ever – how people listen to music and which sounds they seem to prefer.

But did people like OMI’s 2015 hit “Cheerleader” because of its sound and its buzz on social media – as Next Big Sound predicted? Or did it spread on these networks only because it possessed many of the traits of a successful record?

Does taste even matter? You’d like to think you listen to what you enjoy, not what the industry predicts you’ll like based on data. But is your taste your own? Or will the feedback loop – where what you’ve enjoyed in the past shapes what you hear today – change what you’ll like in the future?

This article was originally published on The Conversation. Read the original article.




Apple’s enterprise evolution



Back in 2010, Apple’s iconic co-founder Steve Jobs was not entirely enthralled with the enterprise. In fact, Jobs is famously quoted as saying, “What I love about the consumer market, that I always hated about the enterprise market, is that we come up with a product, we try to tell everybody about it, and every person votes for themselves.”

He added, “They go ‘yes’ or ‘no,’ and if enough of them say ‘yes,’ we get to come to work tomorrow. That’s how it works.”

That was an accurate enough representation of the way things worked when Jobs made the statement. Back in those days, IT kept tight control over the enterprise, issuing equipment like BlackBerries and ThinkPads (and you could have any color you wanted — as long as it was black). Jobs, who passed away in 2011, didn’t live to see “Bring Your Own Device” (BYOD) and the “Consumerization of IT,” two trends that were just hovering on the corporate horizon at the time of his death.

I have the feeling he would have quite liked both movements and would have taken great pleasure in the fact that in many ways those trends were driven by his company’s mobile devices, the iPhone and the iPad. People were using those devices at home and they were increasingly bringing them to work. IT had little choice but to begin accommodating them.

That movement has helped fuel Apple’s enterprise evolution. Over time, Apple has partnered with enterprise stalwarts like IBM, SAP and Cisco. It has provided tools for IT to better manage those i-devices, and Macs, too, and it has built the enterprise into a substantial business (to the extent that we can tell).

What do we have here?

Trying to find data on the size of Apple’s enterprise business is a challenge because it doesn’t often break out enterprise revenue in earnings calls, but to give you a sense of the market, Tim Cook did reveal a number in the Q4 2015 earnings call.

“We estimate that enterprise markets accounted for about $25 billion in annual Apple revenue in the last 12 months, up 40 percent over the prior year and they represent a major growth vector for the future,” Cook said at the time.

In a June 2017 Bloomberg interview, Cook didn’t provide any numbers, but he did call the enterprise, “the mother of all opportunities.” That’s because enterprises tend to buy in bulk, and as they build an Apple support system in-house, it feeds other parts of the enterprise market as companies buy Macs to build custom apps for both internal users and consumers of their products and services.

This connection did not escape Cook in the Bloomberg interview. “For most enterprises, iOS is the preferred mobile operating system. IOS is a fantastic platform because of the ease with which you can write apps that are great for helping you run your business efficiently or interface with your customers directly. We see many, many enterprises now writing apps. Well, what do they use to write the apps? They use the Mac. The Mac is the development platform for iOS,” Cook told Bloomberg.

Photo: Justin Sullivan/Getty Images

Another way to look at the market is to look at Jamf, an Apple enterprise tool partner that helps companies manage Apple devices in large organizations. The company, which launched in 2002, long before the iPad or the iPhone, has been growing in leaps and bounds. It reports it has 13,000 customers today. To put that into perspective, it took 13 years to reach 6,000 customers and just 2.5 years to more than double to 13,000.

“A lot of people say Apple is getting more focused on enterprise, but I believe Apple helped enterprise focus more on users and they’ve had more success,” Jamf CEO Dean Hager told TechCrunch. “It started with Apple creating great products people wanted to bring to work and then they just demanded it,” he said.

Forcing their way into the enterprise

That organic momentum shouldn’t be underestimated, but once its devices got in the door, Apple had to give IT something to work with. IT has always seen its role as hardware and software gatekeeper, keeping the enterprise safe from external security threats.

Ultimately, Apple never set out to build enterprise-grade devices with the iPhone and iPad. It simply wanted devices that worked better than what was out there at the time. That people liked using them so much that they brought them to work was an extension of that goal.

In fact, Susan Prescott, vice president of markets, apps and services at Apple, was at the company when the first iPhone was released, and she was aware of the company’s goals. “With iPhone, we set out to completely rethink mobile, to enable the things we knew that people wanted to do, including at work,” she said.

Susan Prescott of Apple. Photo: Justin Sullivan/Getty Images

The notion of apps, the App Store and bringing in developers of all stripes to build them was also attractive to enterprises. When IBM and SAP got involved, they began building apps specifically geared toward enterprise customers. Customers could access these apps from a vetted App Store, which was also appealing to IT. The Cisco deal gave IT faster on-boarding of Apple devices on networks running Cisco equipment (which most enterprises use).

At the 2010 iPhone 4 keynote, Jobs was already touting the kinds of features that would appeal to enterprise IT, including mobile device management, wireless app distribution through the App Store and even support for Microsoft Exchange Server, the popular corporate email solution of choice at the time.

He may have spoken derisively about the enterprise in a general sense, but he clearly saw the potential of his company’s devices to transform the way people worked by giving them access to tools and technologies that previously were not in reach of the average worker.

Apple also was quietly talking to enterprises behind the scenes and figuring out what they needed from the earliest days of the iPhone. “Early on we engaged with businesses and IT to understand their needs, and have added enterprise features with every major software release,” Prescott told TechCrunch.

Driving transformation

One of the factors driving the change inside organizations was that mobile and cloud were coming together in that 2011 time frame, driving business transformation and empowering workers. If IT wouldn’t give employees the tools they wanted, the App Store and similar constructs gave them the power to do it themselves. That fueled the BYOD and Consumerization of IT movements, but at some point IT still required some semblance of control, even if that didn’t involve the same level they once had.

The iPhone and other mobile devices began to create the mobile worker, who worked outside the protection of the firewall. People could suddenly look at their documents while waiting for the train. They could update the CRM tool in-between clients. They could call a car to get to the airport. All of this was made possible by the mobile-cloud connection.

It was also causing a profound change inside every business. You simply couldn’t do business the same way anymore: you had to produce quality mobile apps and get them in front of your customers.

It was certainly something that Capital One saw. They realized they couldn’t remain a “stodgy bank” anymore, and control every aspect of the computing stack. If they wanted to draw talent, they had to open up, and that meant allowing developers to work on the tools they wanted to. According to Scott Totman, head of Mobile, Web, eCommerce, and personal assistants at Capital One, that meant enabling users to use Apple devices for work, whether their own or those issued by the company.

Workers at Capital One. Photo: Capital One/Apple.

“When I came in [five years ago], the Apple support group was a guy named Travis. We weren’t using Apple [extensively] in the enterprise, [back then],” he says. Today, they have dozens of people supporting more than 40,000 devices.

It wasn’t just people inside the company whose needs were changing. Consumer expectations were changing, too, and the customer-facing mobile tools the company created had to meet those expectations. That meant attracting those app developers to the enterprise and giving them an environment where they felt comfortable working. Clearly, Capital One has succeeded in that regard, and they have found ways to accommodate and support that level of Apple product usage throughout the organization.



Foursquare is finally proving its (dollar) value



In 2009, Facebook was just getting popular with moms and grandmas. People were playing Farmville. Twitter was just becoming mainstream. And Foursquare launched onto the scene.

Back then, Foursquare was just another social network, letting users check in to locations they visit and potentially receive badges for those check-ins.

A lot has changed since 2009, but Foursquare remains, though not without some adversity. Today, in a review of 2017, Foursquare announced that last year was the third in a row in which the company saw at least 50 percent revenue growth.

Foursquare’s data, with over 3 billion visits/month around the globe, 105 million global venues, and 25 million users who have opted in to always-on location sharing, is incredibly valuable to advertisers, businesses and developers. But transitioning from a consumer app to an enterprise platform — going from an Instagram to a comScore — has not been without its trials.

In 2014, Foursquare decided to unbundle its legacy app. Check-ins and ambient location-sharing, the company announced, would move to a new app called Swarm, while the new and improved Foursquare would focus solely on place recommendations, essentially turning Foursquare into a direct competitor to Yelp.

The only way to viably sell data to advertisers, businesses and developers is to have your own army of hungry, active users to provide that data to begin with. And the old Foursquare was bloated and directionless, with a variety of potential uses. In short, it felt stale during a time when new apps were springing up left and right.

But Foursquare knew that the data it was collecting on users would prove its value eventually. And it was able to continue convincing investors that that would be the case.

While the unbundling effort was a risky bet, it seems to have paid off for the company. Both apps have over 50 million monthly active users as of 2016, which has allowed Foursquare to put their foot on the gas with enterprise products.

For example, Pinpoint by Foursquare (an advertising product) now boasts more than half of the Ad Age 100 as advertisers. Attribution by Foursquare lets those brands measure how effective that advertising is. Attribution more than doubled revenue in 2017.

Developer tools are also an integral part of Foursquare’s business. The Pilgrim SDK and Places API “grew substantially,” according to a post by CEO Jeff Glueck, and now provide location tech to 125K+ developers.

Foursquare added 50+ new roles over 2017, including positions in engineering, sales, creative, business development, marketing, and ops. In 2018, the company is opening a new engineering office in Chicago, and plans to grow the team by 30 percent over the course of the year.

It’s taken nearly a decade, but Foursquare is finally proving that it can turn years of consumer data into a viable revenue stream.



Okta teams up with ServiceNow to bring identity layer to breach containment



Okta and fellow cloud company ServiceNow got together to build an app that helps ServiceNow customers using their security operations tools find security issues related to identity and take action immediately.

The company launched the Okta Identity Cloud for Security Operations app today. It’s available in the ServiceNow app store and has been designed for customers who are using both toolsets. When a customer downloads and installs the app, it adds a layer of identity information inside the ServiceNow security operations interface, giving the operations team access to information about who specifically is involved with a security problem without having to exit their tool to find the information.

Okta is a cloud identity management company, while ServiceNow is a cloud service management company. ServiceNow approached Okta about this integration because research has shown that the vast majority of breaches are related to compromised user credentials. The sooner the security operations team can track down the source of those credentials, the sooner they can begin resolving the situation.

The way it works is a company detects a breach through the ServiceNow security tool. Instead of searching logs and taking days or weeks to find the source of the breach, security operations can see the problem user directly in the ServiceNow interface.

With that information, they can decide immediately how to mitigate the issue. That could involve forcing the person to log out of all their applications and log back in with new credentials and two-factor authentication, suspending the user for 24 hours, or a number of other actions at the discretion of the security personnel.
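The article doesn’t detail the API calls involved, so the sketch below is purely illustrative: the `IdentityClient` class and its methods are hypothetical stand-ins for an identity provider’s user and session controls, and the risk threshold is invented. It shows the shape of the containment decision described above.

```python
class IdentityClient:
    """Hypothetical stand-in for an identity provider's admin client."""

    def __init__(self):
        self.suspended = set()
        self.sessions_cleared = set()

    def clear_sessions(self, user_id):
        # Force the user out of all active application sessions,
        # requiring fresh credentials on the next sign-in.
        self.sessions_cleared.add(user_id)

    def suspend_user(self, user_id):
        # Temporarily block sign-in for the account entirely.
        self.suspended.add(user_id)

def contain_breach(client, user_id, risk_score):
    """Pick a remediation based on how risky the incident looks (0-1 scale)."""
    if risk_score >= 0.8:
        client.suspend_user(user_id)   # high risk: lock the account
        return "suspended"
    client.clear_sessions(user_id)     # lower risk: force re-authentication
    return "sessions_cleared"

client = IdentityClient()
print(contain_breach(client, "user-42", 0.9))  # suspended
print(contain_breach(client, "user-43", 0.3))  # sessions_cleared
```

In practice the security operations tool would surface this choice to an analyst rather than automate it, but the value is the same: identity context turns a log entry into an immediately actionable decision.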

Okta identity tools in the ServiceNow interface. Screenshot: ServiceNow

The combination of the two products results in a better solution for customers who are using both tools anyway, says Okta COO and co-founder Frederic Kerrest. “It reduces incident triage, improves risk scoring and accelerates containment,” he explained.

The integration takes advantage of the Okta Advanced Integration Network and involves a set of APIs for inserting Okta functionality inside of other applications. Among the other companies Okta is working with on this kind of integration is Palo Alto Networks.

This is not the first time the two companies have worked together, says Kerrest. There have been a couple of other cases where ServiceNow has used Okta as the default identity management solution in their products.
Featured Image: nicescene/Getty Images

