Dark Social

This panel dug deeper into the visibility of social sharing. How do we bring dark social into the light? The term itself was spawned by the idea of “dark matter” – matter that cannot be seen but that exerts a powerful force on the universe.

Dark social is a term coined by Alexis C. Madrigal, a senior editor at The Atlantic, to refer to the social sharing of content that occurs outside of what can be measured by Web analytics programs. – Wikipedia

Led by moderator Mark Josephson, the panel batted around the good, the bad, and the ugly of all things dark social. One overarching theme was that marketers need to demand that everything is measured so they can optimize what they are doing. Marketers may think they are measuring accurately, but the truth is they are not: the big chunk of “direct” traffic is not what it appears to be.

So where does it come from? The user didn’t type in a URL and there is no bookmark, yet the visit shows up in analytics as direct traffic – it is being misclassified. In reality there is a referrer, but it is unknown because the link came from a platform or app that cannot be measured, such as Slack or Instagram.
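One common heuristic for estimating this hidden traffic – sketched below with hypothetical URLs, and not any panelist’s actual method – is to treat referrer-less visits to deep pages as likely dark social, on the assumption that almost nobody types a long article URL by hand:

```python
# A minimal sketch of the classic dark-social heuristic: a visit with no
# referrer that lands on a deep page (not the homepage) was almost certainly
# shared somewhere unmeasurable, since long URLs are rarely typed by hand.
from urllib.parse import urlparse

def classify_visit(referrer: str, landing_url: str) -> str:
    """Classify one visit as 'referred', 'true direct', or 'dark social'."""
    if referrer:
        return "referred"        # a referrer header means a measurable source
    if urlparse(landing_url).path in ("", "/"):
        return "true direct"     # homepage with no referrer: plausibly typed
    return "dark social"         # deep link with no referrer: likely shared

visits = [
    ("https://twitter.com/", "https://example.com/post/42"),
    ("", "https://example.com/"),
    ("", "https://example.com/blog/dark-social-panel"),
]
for ref, url in visits:
    print(classify_visit(ref, url))  # referred, true direct, dark social
```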

To make matters worse, the problem is not going away; it will only mushroom because of the rapid rise of new platforms and messaging apps. Adding complexity is the fact that most of this activity takes place on mobile, and most analytics programs were not originally designed for mobile measurement.


Panelist Brewster Stanislaw of Simply Measured estimated that analytics tools are potentially missing anywhere from 70-80% of peer-to-peer social sharing. (Other estimates from the panel ranged from 20-60%.) He explained that this traffic is being miscounted over 50% of the time – marketers think it’s direct traffic, but it’s actually links being shared over text, in apps, or in other ways that cannot be measured.

When asked about the battle to find referrers, Matt Thompson from Bitly broke down the three phases of the market as follows:

  1. First, dark social was identified as a real thing: the term was coined and marketers became aware of it.
  2. Next came the Facebook era, when marketers realized that Facebook was not getting the credit it deserved as a referrer because link sharing was not trackable.
  3. Finally, in the third phase – where the market is now – marketers are seeing that chat apps and IM traffic are also sources of dark social and are asking how to parse user agents to measure more accurately (a sketch of such parsing follows this list).
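As an illustration of that third phase, here is a minimal sketch of user-agent parsing, assuming a few in-app browser markers (the token list is illustrative, not exhaustive, and would need ongoing maintenance in production):

```python
# A minimal sketch of attributing in-app traffic from user-agent strings.
# The markers below are illustrative tokens that some mobile apps embed in
# their webview user agents; they are assumptions, not a definitive list.
IN_APP_MARKERS = {
    "Instagram": "instagram",
    "FBAN": "facebook",   # Facebook in-app browser marker (iOS)
    "FBAV": "facebook",   # Facebook in-app browser marker (Android)
    "Twitter": "twitter",
}

def attribute_user_agent(user_agent: str) -> str:
    """Return a best-guess traffic source for a visit's user-agent string."""
    for marker, source in IN_APP_MARKERS.items():
        if marker in user_agent:
            return source
    return "unknown"

ua = ("Mozilla/5.0 (iPhone; CPU iPhone OS 9_3 like Mac OS X) "
      "AppleWebKit/601.1 (KHTML, like Gecko) Mobile/13E233 Instagram 8.2")
print(attribute_user_agent(ua))  # instagram
```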

The moderator pushed the panel to identify specific ways analytics companies can help solve these challenges. For starters, education and awareness about the fact that dark social is a problem need to be stronger. Once enough people raise the alarm and acknowledge that they are not measuring accurately, the industry will continue to find new ways to solve the problem.

As Josh explained, “Once companies find out that most of their traffic is not being counted accurately – they flip out. And they want it fixed immediately.”

Brewster emphasized the importance of the problem by explaining that private peer-to-peer sharing – the link someone texts to a friend – is the “truest expression of organic intent from the consumer.”

Another way of understanding how crucial this issue is: consider it a bottom-of-the-funnel user activity. Social is typically at the top of the funnel, but dark social is at the bottom, meaning it’s closer to the user taking real action.

A few tactical ideas for fixing the issue (the first is sketched in code after the list):

  • Aggressively leveraging UTM parameters
  • Using Bitly tracking features
  • Looking deeper at user agents
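Here is a minimal sketch of the first tactic, assuming a hypothetical URL and campaign taxonomy (the parameter values are illustrative only):

```python
# A hedged sketch of UTM tagging: append campaign parameters to every link
# you distribute so shares surface in analytics under a known source rather
# than as unattributed direct traffic. Values here are invented examples.
from urllib.parse import urlencode

def add_utm(url: str, source: str, medium: str, campaign: str) -> str:
    """Append standard UTM query parameters to a URL."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    separator = "&" if "?" in url else "?"
    return f"{url}{separator}{params}"

print(add_utm("https://example.com/post/42",
              source="newsletter", medium="email", campaign="june-recap"))
# https://example.com/post/42?utm_source=newsletter&utm_medium=email&utm_campaign=june-recap
```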

As the group wrapped up their discussion, the conversation moved towards the customer journey and how complex it is for marketers to measure now because of mobile devices and unattributed traffic. That said, everyone agreed that solving the challenge of dark social is critical because marketing teams must have access to accurate data in order to make decisions about their spend.

The Evolution of Visual Social Media

Two years ago, users were sharing 1.8 million images every day on social media. With 4 billion images now being shared daily, visual listening, or “image intelligence,” is fast becoming one of the hottest technologies in social media. Of the images containing a brand, 85% do not mention the brand in text, which means that if you are analyzing only on the basis of text, you are missing a huge swath of insight. This panel discussed how their respective organizations are pulling intelligence from images that can be used in predictive ways for business and research, and how they see the field evolving in the future.

Glen Szczypka, Principal Research Scientist for NORC at The University of Chicago, talked about using image recognition to study how media messaging and images affect public health concerns. With regard to tobacco use, they are looking at what images tobacco companies are using to attract customers; on the health side, they are working with organizations like the Truth Initiative to see what messages are effective in helping people stop smoking. He noted that it is particularly important to explore Twitter, Instagram, and other platforms where young adults are posting text and images related to smoking.


Ethan Goodman, Senior Vice President of Shopper Experience at The Mars Agency, helps Fortune 500 companies plan marketing activities with large retailers – for example, helping Campbell’s sell more soup at Walmart, Kroger, and other stores. They use image recognition to get into the brains of customers and figure out how to market to them more effectively. Ethan is primarily focused on finding their clients’ brands in photos, as well as finding competitors’ logos. As the practice has evolved, they have started combining image analysis with sentiment and object analysis. He noted that their agency is a customer of, and investor in, Ditto Labs.

Glen’s organization retrieves images from Instagram on the basis of tags, such as #blunt when used as slang for inexpensive cigars. The tags both limit the data set to what is allowed by Instagram and help to find objects – blunts – that can’t be recognized as easily as logos. They then use Ditto to recognize patterns in the image pixels and find logos (over 40% of the images they retrieve contain branded content). When analyzing which brands are featured in images tagged with #blunt, Swisher Sweets was by far the most common. This led Glen’s firm to look into whether the content was organic or whether Swisher Sweets was encouraging people to post pictures of their blunts.

Ethan’s firm, by contrast, retrieves images without any text attributes. Looking through the vast stream of photos containing logos, they were intrigued and shocked by what they found – everything from traditional use cases, such as images of Campbell’s soup showing people making dinner, to images related to a vodka brand where people were putting Skittles into the bottle to make a lava lamp. Ethan then uses the technology to glean insights that can drive a creative idea or marketing strategy. He explained that it’s important to know what other brands are in a targeted brand’s advocacy set. He noted that in the near future, clients will increasingly use this strategy to inform advertising decisions and ad targeting: you will be able to target someone who has been pictured with a particular logo a large number of times with a display ad for that brand.

As social media becomes increasingly visual, users are going to see more applications that feature buttons that allow them to learn or buy without having to leave an application. For example, a user will be able to click on a pair of glasses in an image, find retailers offering those glasses, and then even add them to their cart.

When asked how they train machines to find images, Glen explained that they use an iterative process in which a human looks at an image or text and identifies it as, for example, “smoking” or “blunt.” After the machine finds images, its results are compared with what the human found. If the machine is a bit off, they train it a bit more.
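The loop below is a minimal sketch of that process under toy assumptions – the model, labels, and image IDs are stand-ins, not Ditto’s or NORC’s actual pipeline:

```python
# A toy sketch of iterative human-in-the-loop training: the machine predicts,
# a human provides ground-truth labels, and disagreements drive another round
# of training until the two agree. Everything here is a simplified stand-in.
def human_label(image_id: str) -> str:
    # Stand-in for a person reviewing the image and tagging it.
    return "blunt" if "cigar" in image_id else "other"

class ToyModel:
    def __init__(self):
        self.known = {}
    def predict(self, image_id: str) -> str:
        return self.known.get(image_id, "other")
    def fit(self, image_id: str, label: str) -> None:
        self.known[image_id] = label

def train_iteratively(model, image_ids, max_rounds=3):
    for _ in range(max_rounds):
        wrong = [i for i in image_ids if model.predict(i) != human_label(i)]
        if not wrong:
            break                        # machine now matches the human
        for i in wrong:                  # "if a bit off, train a bit more"
            model.fit(i, human_label(i))
    return model

model = train_iteratively(ToyModel(), ["cigar_001", "photo_002", "cigar_003"])
print(model.known)  # {'cigar_001': 'blunt', 'cigar_003': 'blunt'}
```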

In terms of advice for companies wanting to start using image recognition, Ethan recommended that you start now. He noted that it is low risk to start. Think about how visual listening fits into your larger strategy, how it can complement your text listening tool. Glen noted that if you need to limit the data set, choose the tags you’re searching on carefully. For example, don’t use your brand as a keyword.

The panel also discussed ecosystem challenges related to differences in data access across platforms. For example, users can generally pull images through APIs; however, there are limits. A user can only access publicly available images on Facebook. Instagram recently changed its terms so that a development license is required to use its API. Other social media platforms simply don’t allow users to access their images. As a result, one might not be able to pull 100% of what is available, but can still retrieve a solid representative sample.

You should also set expectations about not only what is available, but also how long it takes to retrieve the data for analysis. Ethan mentioned that it takes between 3 and 7 days to train a machine to precisely identify a logo – definitely something to keep in mind when planning future image recognition efforts.

The Power of Live

Two days ago, the Speaker of the House shut down C-SPAN camera access on the floor of the House of Representatives, denying Democratic lawmakers the news coverage they wanted for the sit-in they staged in protest of firearm legislation that failed to get enough votes. As Sara Haider, Client Engineering Lead at Periscope, pointed out, “Our technology is meant to sort of give a voice to the voiceless. But ironically, our lawmakers–at this point–were the ones who were voiceless.”

Haider was talking about the live-stream takeover orchestrated by four lawmakers who took to Periscope during the sit-in. Periscope, the live-streaming tool that is both a stand-alone app and a Twitter integration, is where live video news is known to break. When the lawmakers started broadcasting their sit-in, it drew the biggest audience for a live-stream feed that Periscope had ever seen. Haider herself found out about the feed on Twitter, where the buzz just kept getting louder. Eventually, C-SPAN decided to air the Periscope broadcast from the House floor as its own footage; the Periscope team had to get in touch with the network to educate them on how the tool should be used, covering everything from basic etiquette to changing the on-air stream from portrait to landscape mode.


Periscope, for those who are unfamiliar, is the hot live-streaming app that launched a thousand new competitors (or really, just a few large ones). Haider described the company’s vision as “a way to see the world through someone else’s eyes.” Users can download the app and start broadcasting live, and their feed can be joined by anyone, anywhere. Viewers can engage with the broadcaster by commenting or tapping their screen, which shows the broadcaster visual “hearts” on-screen – encouragement to continue the broadcast or agreement with something they’ve said or done. Think finger-snapping at a poetry reading. Shortly after Periscope launched and began to catch fire, social giants like Facebook and Snapchat began creating their own live-streaming tools, seeing the opportunity that clearly existed with this new type of audience.

Despite the large names that have suddenly become competition, Haider remains unfazed, indicating that, while Periscope currently operates on its own platform and integrates with only Twitter, they are willing and able to integrate across all social ecosystems. The brand seems to be working to build a loyal following that would be willing to use their specific live-streaming tool no matter which social media platform they’re using at the time.

That following of users is definitely building. Much like a Snapchat audience, many Periscope feeds are about the users themselves: who they are, what they’re doing at any given point in time. Many speakers and event marketers are using Periscope to drive more attention and attendance for their conferences and seminars. A growing audience can also be found in Periscope communities; like-minded people are coming together for real-time meetup groups through the power of live-streaming, no matter where they happen to live. Where the internet gave individuals access to one another in conversation, Periscope gives them the ability to have coffee in a live meet-up group, watching and interacting with the broadcaster in real time.

What does the future hold for Periscope? Live video apps may not seem to offer much room for competitive differentiation, but there is plenty of new ground to break. Marketers should look for best-practices guides as the tool grows in popularity; typical users can expect greater convenience and deeper ecosystem integration and accessibility as time goes on. In any case, it is clear that the future of technology is in real-time visuals; the data it can provide remains to be seen, but should most certainly be monitored closely!

Welcome Day 2!

Day 2 of Big Boulder has officially started!

The sun is shining, and it is already a beautiful Colorado day outside. Earlier this morning a few attendees were able to catch a glimpse of His Holiness the Dalai Lama as he exited the St. Julien Hotel – apparently good company is kept here at the #BigBoulder Conference. 🙂 Other attendees participated in a nearby hike from the Chautauqua National Historic Landmark up to Boulder’s signature Flatirons rock formations, capturing beautiful views of the city all the while. It is certainly shaping up to be a good one!

The day’s sessions kicked off with moderators Mark Josephson, the CEO of Bitly, and Farida Vis, Director of the Visual Social Media Lab at Sheffield University, discussing a few topics from yesterday and reiterating that the BBI channel on Slack is a key part of the BBI mission and focus for ongoing programming. The flow of information, as they say, is crucial for building community through Slack.

They closed their short introduction by mentioning that the Big Boulder Initiative would be having a discounted membership drive from now until next Friday (July 1st, 2016), with a BOGO membership costing $250. Then it was “down to business” again with the introduction of the first interview with Sara Haider, the Client Engineering Lead at Periscope.

If you would like a recap of Day 1 of BBI 2016, be sure to check out this link below!

https://blog.bbi.org/2016/06/24/thoughts-from-big-boulder-day-1/

 

Thoughts from Big Boulder – Day 1

Day 1 of Big Boulder 2016 is a wrap!

Emcee Farida Vis, Director of the Visual Social Media Lab, Faculty Research Fellow in the Information School at Sheffield University, shared her three take-aways from the day:

  1. The talks today revealed that there are a lot of topics our industry is getting better at. And there’s still a lot of stuff we’re not good at.
  2. The “Social Data and Politics” panel revealed that everyone is analyzing data, but many of the methodologies aren’t great. Yet.
  3. Building strong teams, tenacity, being brave, and doing first/asking for permission later are concepts we toss around. The “Connecting with Customers” panel showcased real examples of someone walking the talk.

And in case you missed anything along the way, you can view all of the blog coverage from Day 1 of Big Boulder 2016 below:

We look forward to seeing you all again tomorrow, so stay tuned for more great presentation content from Day 2 of Big Boulder 2016!

Social Listening 3.0

Jonathan Farb, Chief Product Officer at ListenFirst Media, gave the second “Pecha Kucha” presentation of the day on Social Listening 3.0. To review, Pecha Kucha is a Japanese presentation style that focuses on the spirit of efficiency by having the narrator speak on exactly 20 slides, each of which is shown for only 20 seconds (6 minutes and 40 seconds in total).


Jon started out by saying that if you were to ask everyone in the room to define social listening, you would get a large number of different answers, including:

  • Sentiment
  • Positive/negative/neutral
  • Consumer reaction
  • Social media data
  • Forums

Jonathan asserted that “we sell social listening to tap into the world’s largest focus group” to understand what they’re doing, what they are saying, what they want to buy.

Up until recently, social listening has primarily focused on computers analyzing text. In the beginning, we had forums, BBSes, message boards, and similar tools. Social Listening 2.0 was able to structure conversation and topics in a way that supported enhanced analysis capabilities, such as measuring the volume, extracting the sentiment, and performing other text processing.

Then social listening changed again by adding “clicks,” such as clicking on icons to follow, like, or Retweet. It became immediately clear that not everyone was a content author and that some users liked to click. A lot. Tracking clicks allowed us to shift from the qualitative to the quantitative and to analyze what someone was saying without them actually having to say it.

The premise of Social Listening 3.0 is that conversation is not the entire story. If you are still evaluating the value of your brand based on text only, you are under-monetizing other engagements available to you. Not taking advantage of clicks (the rest of the story) is a real problem.

After conducting research with their clients, Jonathan’s firm found that less than 50% of fans actually post content. On the other hand, 100% of social media users click, providing valuable information that can be leveraged for ad strategies. For example, of 4.58MM digital engagements for The Walking Dead, 80% of the monetizable data was not being analyzed.

You can also glean more insights from analyzing clicks than from what users actually type out, and apply those insights to do optimization and competitive intelligence better. For example, Tesla is one of the least-discussed brands on the Internet, but it is always in the top 5% for fan engagement.
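The contrast is easy to see in a toy ranking, sketched below with invented numbers (none of these figures come from the panel):

```python
# A hedged sketch of ranking brands by click engagement instead of by
# conversation volume: a quiet-but-clicked brand ranks last on mentions
# and first on engagement. All numbers are invented for illustration.
brands = {
    # brand: (text mentions, click engagements such as likes and follows)
    "BrandA": (500_000, 600_000),
    "QuietBrand": (40_000, 2_500_000),   # Tesla-like profile
    "BrandC": (300_000, 250_000),
}

by_mentions = sorted(brands, key=lambda b: brands[b][0], reverse=True)
by_engagement = sorted(brands, key=lambda b: brands[b][1], reverse=True)

print("By conversation volume:", by_mentions)    # QuietBrand ranks last
print("By click engagement:  ", by_engagement)   # QuietBrand ranks first
```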

The upshot is that if you’re still using conversation analysis as the only basis for moving millions of dollars around, you’re not getting the full story.

Social Data and Finance

To kick off this panel, David Schweidel, Associate Professor of Marketing at Emory University, asked a few general questions, such as “What are the best practices for brands as far as social media is concerned?” and “How are some of these brands leveraging social data?” How can an organization take stock of where it is at that moment in time, where its potential lies, and what factors are still holding it back?

Sunayna Tuteja, Director of Social Media & Online Communities at TD Ameritrade, was the first to respond. She spoke of how Ameritrade is using social data to serve its customers better: for many years, those in financial services had to sit back and peer enviously as companies in other industries surged ahead using social data. Now, Ameritrade seems to have the “art of the impossible” figured out, and its clients can see financial services in a new light – or, as Ms. Tuteja puts it, “sexy and fun.”


So how does one pull social and digital capabilities into the business? It was suggested that you look at the situation holistically and approach social with the underlying purpose of helping the organization truly connect with its clients as well as its employees. As it turns out, social is the “new 1-800 number”: listening to the good, the bad, and the ugly can turn candidates into employees and prospects into clients.

Robert Passarella, President at Protege Partners, opened with a fun Caddyshack reference about the Dalai Lama and then jumped into the conversation. Passarella spoke of the few turning points in social media’s short history that led to financial institutions’ participation in the digital age. One was Twitter’s “cashtag,” which allows investors to search for company tickers directly through the search feature. Another crucial turning point came in 2008, when social finance really took off in the wake of the market crash – people wanted information about their finances, and they wanted it quickly.

Brian Wright, the Social Media Analysis and Command Center Leader at Wells Fargo, also had a few words of wisdom to contribute. When asked, “At Wells Fargo, do you run into one department wanting to own all of the social data?,” Wright responded that there is a strong focus on centralizing the data flow. That includes bringing in a center-of-excellence model while also allowing for flexibility in the process. He stated that, with everyone on the same platform, there is a single source of truth for clients and employees alike. This continued centralization and integration can only help large organizations deal with economies of scale and bring their processes to the next level.

What this all comes down to is that traders and investors have been gravitating toward Twitter for quite some time now – nearly 10 years, which in the digital world can seem like a century. Now, when news breaks on Twitter, investors can look for a quick snippet of information for their next buy, one that comes in 140 characters or less. But for financiers looking to gain more value from their business practices, the key lies in genuine connection. The deeper the relationships an organization builds with its users, the more potential there is to create win-win situations for everyone, generating increased value.

Defend and Respect the User’s Voice

A panelist in the “Connecting with Customers” panel highlighted the approach of “do first, ask permission later.” But Jarred Taylor, Product Counsel at Twitter, proposed a different approach in this panel: “do and ask for permission at the same time.”


What does that mean? At Twitter, it takes the form of a “product counsel,” a concept developed at Google approximately a decade ago. A product counsel pairs legal counsel with product teams to both do and ask for permission at the same time. Jarred explained that this approach means companies like Twitter no longer just rely on Terms of Service and law to protect user privacy. Instead, a product counsel can help to develop a product where the user can understand the security and privacy tools as they use the product day to day. A product counsel balances existing law with user trust.

Historically, companies have addressed security and privacy with “Put something in your terms of service and you’re covered.” Now, however, user interface and user experience are as important as the information in a terms of service document.

When asked how the Twitter platform has evolved to be safer for users, the unanimous answer from the panel was this: User safety is constantly shifting territory. What qualifies as a “negative user experience” on the platform is always changing. It’s easier to handle experiences that are manifestly abusive than it is to handle situations where it’s not clear whether it’s abuse or not. For example, what is considered hate speech versus what is considered freedom of speech varies from person to person, across cultures, and from country to country.

“The best thing we can do is give users control over their experience,” explained Yoel Roth, Product Trust at Twitter. He explained that the tools for safety—such as the “block” and “mute” buttons—must be easy for the user to understand. If the customer gets it, then the customer can have more control. By considering privacy and safety as new products and features are being created, Twitter can proactively address security and privacy issues.


A takeaway for other companies is to build legal counsel into product development early. Combining an early product development approach with thoughtfulness about the user experience can position a company to better defend and respect the user’s voice.

Connecting with Customers

This panel highlighted how big brands are using social data across the organization, especially to connect with customers.


Panelist Jayadev Gopinath, Chief Data Officer at Toyota Motor North America, gathers, manages, and analyzes data from two main sources: social data from various channels, and industry data purchased from organizations like J.D. Power. This is used across the organization, from PR to product to customer support. It’s the customer team, though, that has the deepest engagement with social data. Jayadev’s driving question is, “How do you make data part of everyone’s day-to-day job?” The data must be actionable. According to Jayadev, the sooner that companies like Toyota can receive and resolve customer feedback, the more money they can save through product improvements.

Panelist Kriti Kapoor, Global Director of Social Customer Care at HP, Inc., leads change from within the customer group at her company as well. Although social data was historically on the periphery at HP, over time and through a lot of learning it has moved to the center of the organization. Once the company recognized the power of social data, HP shifted its customer focus to gain insight from online communities, Facebook, and Twitter.

In order to educate groups across HP, Kriti has had over 1,000 conversations teaching her colleagues about the data and the value of customer care built on social data. “Social is breaking boundaries across departments,” she explained. The volume of data HP manages helps to break these boundaries: Kriti’s team responds to more than 100,000 inquiries per month, across Facebook, Twitter, and 22 other channels, in 10 languages.

Social is breaking boundaries across departments. – Kriti Kapoor, HP

Moderator Justin De Graaf, Director of Data Strategy & Precision Marketing at The Coca-Cola Company, asked when and how often the panelists looked to external companies for support in customer care. Kriti’s and Jayadev’s responses had a common through-line: use external tools to monitor and collect data, but analyze and interpret the data inside the organization. Toyota uses various agencies, buys industry data, and applies machine learning to gather raw data. Those external resources are closely managed internally by Toyota teams—in that way, it’s not fully “outsourced.” Instead, it’s carefully managed and analyzed via the expertise inside the company. Similarly, HP uses tools like Radian6 to monitor data, and designs social care internally with a playbook that defines support practices, escalation paths, staff training, and more to ensure not just response to customers—but engagement with customers.

When asked what pitfalls they’ve encountered in their customer care programs, Kriti’s and Jayadev’s responses were similar: When launching a new customer care program, run tests before pitching it to the executive team. Executives often don’t have familiarity with the social data and customer space, so presenting them with only an idea can lead to wasted time and resources. Prototype and test first. Sell internally later.

Kriti’s approach is: do first, ask permission later. Selling innovative concepts into an organization requires storytelling with data. Kriti asks three questions when assessing new programs: Can we serve people at scale? Can we do it with quality? And lastly, can we do it economically? “Executives are looking for people who will innovate their way to success, not cost-cut their way to success,” said Kriti.

 

Bringing Social Analytics In-House

History belongs to those who survive to write it, and data belongs to those who created companies to compile it. When companies have commodities they know to be hot, the market becomes their playground, and that playground can be prohibitively expensive and constricting. But what if companies could create their own data playground?

Mark Clarke of Unilever and Shawn O’Neal, until recently with the same company, spoke to attendees about how their organization did just that: brought data compilation, analysis, and insights completely in-house. Clarke noted the immediate value of bringing company data back to home turf, namely that companies don’t always want consumer data to be public. Data relating to sales, media spend, product trials, and the like can be sensitive and are meant to be confidential. When a company controls that data from where it sits, as opposed to using an outside software product or service, it is able to ensure data security and privacy on a deeper level.


While Unilever is a global company with large budgets and plenty of offices serving as data sources and warehouses, creating their own in-house data center was not without its challenges. Each Unilever-owned brand, each global office, each function had to believe that data would really affect them in a positive way. How did the company get the buy-in from all these different parties? For starters, they made each of these offices and brands literally buy in: “We never gave our data away for free. Each brand had to make the decision to purchase the data from our mother company–that way, you know they’re invested in what the data can do for them,” said O’Neal. Clarke also explained the challenge of convincing these different department heads of the value of data: “On the one hand, you have digital marketers who are just on digital because they know their customers are there, so they feel like they don’t need to know anything else. On the other extreme, you have marketers who have been trained since traditional marketing school to only care about the ROI of any specific decision, and they don’t see how you can show them the ROI of investing in this data.”

But Unilever proved both parties wrong: by collecting data from surveys, social listening, and search listening, they’ve been able to determine that people talk about the same thing in different ways at different times. On the topic of stains, for instance, their Consumer Insights department found that when people filled out surveys about problematic stains they’d had in the past, their answers were often what Clarke jokingly referred to as “mythological memories” of stains past – the information might not be accurate. On social, people would talk about stains as they were happening: “Ketchup just dripped down my shirt; what do I do?” And on search, people were engaging about stains when the problem became a crisis: that ketchup won’t come out of the shirt. These insights, derived from their own in-house data center, helped brand managers and marketers understand how to market to consumers about stains in different environments, at different stages in the ketchup-stain life cycle.

Finally, Clarke and O’Neal noted that it is not necessary that an in-house data center operate entirely in-house. How is that possible? Unilever keeps the data oversight management and data consolidation functions in-house, but they outsource some of the bigger labor efforts: data collection and social listening, for example.

Keeping a data center in-house allows a company to have better control over its data, ensures ownership of security and privacy, and lets it customize the insights important to it, instead of being stuck with the insights an outside product “believes” are most important. While not every company may have the option of building an in-house data team, Clarke and O’Neal provided a compelling case for keeping control of specific aspects inside the company, as opposed to entrusting that control to outside parties.