Welcome to Day 2!

Day 2 of Big Boulder has officially started!

The sun is shining, and it is already a beautiful Colorado day outside. Earlier this morning a few attendees were able to catch a glimpse of His Holiness the Dalai Lama as he exited the St. Julien Hotel, as apparently good company is kept here at the #BigBoulder Conference. 🙂 Other attendees participated in a nearby hike from Chautauqua National Historic Landmark up to Boulder’s signature Flatirons rock formations, capturing beautiful views of the city all the while. It is certainly shaping up to be a good one!

The day’s sessions kicked off with moderators Mark Josephson, the CEO of Bitly, and Farida Vis, Director of the Visual Social Media Lab at Sheffield University, discussing a few topics from yesterday and reiterating that the BBI channel on Slack is a key part of the BBI mission and focus for ongoing programming. The flow of information, as they say, is crucial for building community through Slack.

They closed their short introduction by mentioning that the Big Boulder Initiative would be having a discounted membership drive from now until next Friday (July 1st, 2016), with a BOGO membership costing $250. Then it was “down to business” again with the introduction of the first interview with Sara Haider, the Client Engineering Lead at Periscope.

If you would like a recap of Day 1 of BBI 2016, be sure to check out this link below!



Thoughts from Big Boulder – Day 1

Day 1 of Big Boulder 2016 is a wrap!

Emcee Farida Vis, Director of the Visual Social Media Lab, Faculty Research Fellow in the Information School at Sheffield University, shared her three take-aways from the day:

  1. The talks today revealed that there are a lot of topics our industry is getting better at. And there’s still a lot of stuff we’re not good at.
  2. The “Social Data and Politics” panel revealed that everyone is analyzing data, but many of the methodologies aren’t great. Yet.
  3. Building strong teams, tenacity, being brave, and doing first/asking for permission later are concepts we toss around. The “Connecting with Customers” panel showcased real examples of someone walking the talk.

And in case you missed anything along the way, you can view all of the blog coverage from Day 1 of Big Boulder 2016 below:

We look forward to seeing you all again tomorrow, so stay tuned for more great presentation content from Day 2 of Big Boulder 2016!

Social Listening 3.0

Jonathan Farb, Chief Product Officer at ListenFirst Media, gave the second “Pecha Kucha” presentation of the day on Social Listening 3.0. To review, Pecha Kucha is a Japanese presentation style that focuses on the spirit of efficiency by having the narrator speak on exactly 20 slides, each of which is shown for only 20 seconds (6 minutes and 40 seconds in total).


Jon started out by saying that if you were to ask everyone in the room to define social listening, you would get a large number of different answers, including:

  • Sentiment
  • Positive/negative/neutral
  • Consumer reaction
  • Social media data
  • Forums

Jonathan asserted that “we sell social listening to tap into the world’s largest focus group” – to understand what they’re doing, what they’re saying, and what they want to buy.

Up until recently, social listening has primarily focused on computers analyzing text. In the beginning, we had forums, BBSes, message boards, and similar tools. Social Listening 2.0 was able to structure conversation and topics in a way that supported enhanced analysis capabilities, such as measuring the volume, extracting the sentiment, and performing other text processing.
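The “enhanced analysis capabilities” described here can be sketched in a few lines. The following Python toy measures conversation volume and extracts a crude sentiment label; the lexicon and sample posts are invented for illustration, and real social-listening systems use trained models rather than keyword lists:

```python
# Toy social-listening sketch: measure volume and extract sentiment
# from a batch of posts using a tiny hand-built keyword lexicon.

POSITIVE = {"love", "great", "awesome", "good"}
NEGATIVE = {"hate", "terrible", "awful", "bad"}

def sentiment(text):
    """Return 'positive', 'negative', or 'neutral' for one post."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

posts = [
    "I love this show, it is great",
    "terrible episode, just awful",
    "new episode airs tonight",
]
volume = len(posts)                      # measuring the volume
labels = [sentiment(p) for p in posts]   # extracting the sentiment
print(volume, labels)  # → 3 ['positive', 'negative', 'neutral']
```

Even a sketch this small shows the Social Listening 2.0 shape: structured posts in, volume and sentiment out.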

Then social listening changed again by adding “clicks,” such as clicking on icons to follow, like, or Retweet. It became immediately clear that not everyone was a content author and that some users liked to click. A lot. Tracking clicks allowed us to shift from the qualitative to the quantitative and to analyze what someone was saying without them actually having to say it.

The premise of Social Listening 3.0 is that conversation is not the entire story. If you are still evaluating the value of your brand based on text only, you are under-monetizing other engagements available to you. Not taking advantage of clicks (the rest of the story) is a real problem.

After conducting research with their clients, Jonathan’s firm found that less than 50% of fans actually post content. On the other hand, 100% of social media users click, providing valuable information that can be leveraged for ad strategies. For example, of 4.58MM digital engagements, 80% of the monetizable data for The Walking Dead was not analyzed.

You can also glean more insights from analyzing clicks than from what users actually type out, and apply those insights to better optimization and competitive intelligence. For example, Tesla is one of the least-discussed brands on the Internet, but it is always in the top 5% for fan engagements.
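As a rough illustration of this argument, ranking by conversation volume alone and ranking by total engagement can point to very different “top” brands. All brand names and numbers below are hypothetical:

```python
# Hypothetical sketch of the Social Listening 3.0 idea: rank brands by
# total engagements (mentions plus clicks, likes, follows) rather than
# by text conversation alone.

brands = {
    # brand: (text mentions, click engagements)
    "BrandA": (50_000, 60_000),
    "BrandB": (2_000, 900_000),   # low conversation, huge engagement
}

def by_conversation(data):
    """Top brand when only text mentions are counted."""
    return max(data, key=lambda b: data[b][0])

def by_total_engagement(data):
    """Top brand when clicks are counted too."""
    return max(data, key=lambda b: sum(data[b]))

print(by_conversation(brands))      # → BrandA
print(by_total_engagement(brands))  # → BrandB
```

BrandB here plays the role of the Tesla example above: nearly invisible in conversation volume, dominant once clicks are counted.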

The upshot is that if you’re still using conversation analysis as the only method for moving millions of dollars around, you’re obviously not getting the full story.

Social Data and Finance

To kick off this panel, David Schweidel, Associate Professor of Marketing at Emory University, asked a few general questions, such as what the best practices for brands are where social media is concerned, and how some of these brands are leveraging social data. How can an organization take stock of where it is at that moment in time, where its potential lies, and what factors are still holding it back?

Sunayna Tuteja, the Director of Social Media & Online Communities at TD Ameritrade, was the first to respond. She spoke of how Ameritrade is using social data to serve its customers better. For many years, those in financial services had to sit back and peer enviously as companies in other industries surged ahead by using social data. Now, Ameritrade seems to have the “art of the impossible” figured out, and its clients can see financial services in a new light, or, as Ms. Tuteja puts it, “sexy and fun.”


So how does one pull social and digital capabilities into the business? It was suggested that you look at the situation holistically and approach social with the underlying purpose of helping the organization truly connect with its clients as well as its employees. It turns out that social is the “new 1-800 number”: listening to the good, the bad, and the ugly can turn candidates into employees and prospects into clients.

Robert Passarella, President at Protege Partners, opened with a fun Caddyshack reference about the Dalai Lama and then jumped into the conversation as well. Passarella spoke of the few turning points in social media’s short history that led to financial institutions’ participation in the digital age. One was the “cashtag” on Twitter, which allows investors to search for company tickers directly through the search feature. Another crucial turning point came in 2008, when social finance really took off with the market crash. After all, people wanted information about their finances, and they wanted it quickly.

Brian Wright, the Social Media Analysis and Command Center Leader at Wells Fargo, also had a few words of wisdom to contribute to the conversation. When asked, “At Wells Fargo, do you run into one department wanting to own all of the social data?,” Wright responded that there is a strong focus on centralizing the data flow. That includes bringing in the center of excellence model while also allowing for flexibility in the process. He stated that, with everyone on the same platform, there is a single source of truth for clients and employees alike. This continued centralization and integration can only help large organizations deal with economies of scale and bring their processes to the next level.

What this all comes down to is that traders and investors have been gravitating toward Twitter for quite some time now – nearly 10 years, which, in the digital world, can seem like a century. Now, when news breaks on Twitter, investors can look for a quick snippet of information for their next buy, one that comes in 140 characters or less. But for financiers looking to gain more value from their business practices, the key lies in genuine connection. The deeper the relationships an organization can build with its users, the more potential it has to create win-win situations for everyone, generating increased value.

Defend and Respect the User’s Voice

A panelist in the “Connecting with Customers” panel highlighted the approach of “do first, ask permission later.” But Jarred Taylor, Product Counsel at Twitter, proposed a different approach in this panel: “do and ask for permission at the same time.”


What does that mean? At Twitter, it takes the form of a “product counsel,” a concept developed at Google approximately a decade ago. A product counsel pairs legal counsel with product teams to both do and ask for permission at the same time. Jarred explained that this approach means companies like Twitter no longer just rely on Terms of Service and law to protect user privacy. Instead, a product counsel can help to develop a product where the user can understand the security and privacy tools as they use the product day to day. A product counsel balances existing law with user trust.

Historically, companies have addressed security and privacy with “Put something in your terms of service and you’re covered.” Now, however, user interface and user experience are as important as the information in a terms of service document.

When asked how the Twitter platform has evolved to be safer for users, the unanimous answer from the panel was this: User safety is constantly shifting territory. What qualifies as a “negative user experience” on the platform is always changing. It’s easier to handle experiences that are manifestly abusive than it is to handle situations where it’s not clear whether it’s abuse or not. For example, what is considered hate speech versus what is considered freedom of speech varies from person to person, across cultures, and from country to country.

“The best thing we can do is give users control over their experience,” explained Yoel Roth, Product Trust at Twitter. He explained that the tools for safety—such as the “block” and “mute” buttons—must be easy for the user to understand. If the customer gets it, then the customer can have more control. By considering privacy and safety as new products and features are being created, Twitter can proactively address security and privacy issues.


A takeaway for other companies is to build legal counsel into product development early. Combining an early product development approach with thoughtfulness about the user experience can position a company to better defend and respect the user’s voice.

Connecting with Customers

This panel highlighted how big brands are using social data across the organization, especially to connect with customers.


Panelist Jayadev Gopinath, Chief Data Officer at Toyota Motor North America, gathers, manages, and analyzes data from two main sources: social data from various channels, and industry data purchased from organizations like J.D. Power. This is used across the organization, from PR to product to customer support. It’s the customer team, though, that has the deepest engagement with social data. Jayadev’s driving question is, “How do you make data part of everyone’s day-to-day job?” The data must be actionable. According to Jayadev, the sooner that companies like Toyota can receive and resolve customer feedback, the more money they can save through product improvements.

Panelist Kriti Kapoor, Global Director of Social Customer Care at HP, Inc., leads change from within the customer group at her company as well. Although social data was historically on the periphery at HP, over time and through a lot of learning it has moved to the center of the organization. Once the company recognized the power of social data, HP shifted its customer focus to gain insight from online communities, Facebook, and Twitter.

In order to educate groups across HP, Kriti has had over 1,000 conversations to teach her colleagues about the data and value of customer care using social data. “Social is breaking boundaries across departments,” she explained. The volume of data HP manages helps to break these boundaries: Kriti’s team responds to more than 100,000 inquiries per month, across Facebook, Twitter, and 22 other channels, in 10 languages.

Social is breaking boundaries across departments. – Kriti Kapoor, HP

Moderator Justin De Graaf, Director of Data Strategy & Precision Marketing at The Coca-Cola Company, asked when and how often the panelists looked to external companies for support in customer care. Kriti’s and Jayadev’s responses had a common through-line: use external tools to monitor and collect data, but analyze and interpret the data inside the organization. Toyota uses various agencies, buys industry data, and applies machine learning to gather raw data. Those external resources are closely managed internally by Toyota teams—in that way, it’s not fully “outsourced.” Instead, it’s carefully managed and analyzed via the expertise inside the company. Similarly, HP uses tools like Radian6 to monitor data, and designs social care internally with a playbook that defines support practices, escalation paths, staff training, and more to ensure not just response to customers—but engagement with customers.

When asked what pitfalls they’ve encountered in their customer care programs, Kriti’s and Jayadev’s responses were similar: When launching a new customer care program, run tests before pitching it to the executive team. Executives often don’t have familiarity with the social data and customer space, so presenting them with only an idea can lead to wasted time and resources. Prototype and test first. Sell internally later.

Kriti’s approach is: do first, ask permission later. Selling innovative concepts into an organization requires storytelling with data. Kriti asks three questions when assessing new programs: Can we serve people at scale? Can we do it with quality? And lastly, can we do it economically? “Executives are looking for people who will innovate their way to success, not cost-cut their way to success,” said Kriti.


Bringing Social Analytics In-House

History belongs to those who survive to write it, and data belongs to those who created companies to compile it. When companies have commodities they know to be hot, the market becomes their playground, and that playground can be prohibitively expensive and constricting. But what if companies could create their own data playground?

Mark Clarke of Unilever and Shawn O’Neal, until recently with the same company, spoke to attendees about how their organization did just that: brought data compilation, analysis, and insights completely in house. Clarke noted the immediate value of bringing company data back to home turf, namely that companies don’t always want consumer data to be public. Data relating to sales, media spend, product trials, and the like can be sensitive and is meant to be confidential. When a company controls that data from where it sits, instead of relying on an outside software product or service, it can ensure data security and privacy at a deeper level.


While Unilever is a global company with large budgets and plenty of offices serving as data sources and warehouses, creating their own in-house data center was not without its challenges. Each Unilever-owned brand, each global office, each function had to believe that data would really affect them in a positive way. How did the company get the buy-in from all these different parties? For starters, they made each of these offices and brands literally buy in: “We never gave our data away for free. Each brand had to make the decision to purchase the data from our mother company–that way, you know they’re invested in what the data can do for them,” said O’Neal. Clarke also explained the challenge of convincing these different department heads of the value of data: “On the one hand, you have digital marketers who are just on digital because they know their customers are there, so they feel like they don’t need to know anything else. On the other extreme, you have marketers who have been trained since traditional marketing school to only care about the ROI of any specific decision, and they don’t see how you can show them the ROI of investing in this data.”

But Unilever proved both parties wrong: by collecting data from surveys, social listening, and search listening, they’ve been able to determine that people talk about the same thing in different ways at different times. On the topic of stains, for instance, their Consumer Insights department found that when people filled out surveys about problematic stains they’d had in the past, the responses were often what Clarke jokingly referred to as “mythological memories” of stains past – the information might not be accurate. On social, people would talk about stains as they were happening: “Ketchup just dripped down my shirt; what do I do?” And on search, people were engaging about stains when the problem became a crisis: that ketchup won’t come out of the shirt. These insights, derived from their own in-house data center, helped brand managers and marketers understand how to market to consumers about stains in different environments, at different stages in the ketchup stain life-cycle.

Finally, Clarke and O’Neal noted that it is not necessary that an in-house data center operate entirely in-house. How is that possible? Unilever keeps the data oversight management and data consolidation functions in-house, but they outsource some of the bigger labor efforts: data collection and social listening, for example.

Keeping a data center in-house gives a company better control over its data, ensures owned responsibility for security and privacy, and lets it customize the insights most important to it, instead of being stuck with the insights an outside product “believes” are most important. While not every company may have the option of building an in-house data team, Clarke and O’Neal provided a compelling case for keeping control of specific aspects inside the company, as opposed to entrusting that control to outside parties.

Stop Doing Single Platform Analytics

Eric Bell, a Natural Language Processing Software Scientist at PNNL, gave a “Pecha Kucha” presentation on the value of analysis across multiple media types. Using this fast-paced presentation style, Eric spoke on exactly 20 slides, each of which was shown for only 20 seconds (6 minutes and 40 seconds in total).

He started by explaining the differences between how we analyze people versus how we analyze companies. With people, we ask questions like “What do they eat? What sort of pictures do they share?” Then we take information derived from different bubbles and mash it together to make a statement. However, we wouldn’t use that method to characterize a company—we see an organization as a much more complex entity with various departments, levels, and sophisticated moving parts.

By way of example, he showed a Tweet that he recently posted about his cat.

To analyze a Tweet, we typically look at its text, the terms, #hashtags, Twitter handles, and other attributes. We might also try to understand the Tweet’s audience—who is interacting with the post and how that looks over time. We might also analyze the image of his cat via the bit.ly link.
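That first step, pulling out #hashtags, handles, and links, can be sketched with simple regexes. The sample Tweet text and handle below are invented, and production systems would typically read these entities from the platform’s API metadata rather than re-parse the text:

```python
import re

# Toy Tweet-attribute extractor: find hashtags, @handles, and links
# in a Tweet's raw text using regular expressions.

def extract_entities(text):
    return {
        "hashtags": re.findall(r"#(\w+)", text),
        "handles": re.findall(r"@(\w+)", text),
        "links": re.findall(r"https?://\S+", text),
    }

tweet = "My cat strikes again #catsofinstagram @someauthor https://bit.ly/abc123"
print(extract_entities(tweet))
# → {'hashtags': ['catsofinstagram'], 'handles': ['someauthor'],
#    'links': ['https://bit.ly/abc123']}
```

Audience analysis then builds on these extracted attributes: who interacts with each hashtag or handle, and how that changes over time.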


Now we’re starting to learn something about the author and why he made the post. We might also go on to analyze other Tweets and images that this author put up to form a picture of what he likes, what he is interested in, how he interacts.

Eric’s presentation then segued to the question, “Why do we perform analysis in a siloed, isolated fashion?” We produce reports and charts to make decisions on the basis of data, but we do that by analyzing each individual channel separately and analyzing traditional media apart from social media. He asked whether there is value in trying to bring the diverse data types together. While we suspect there is great value in being able to do this, we see it as a daunting task.

Eric went on to ask whether we can be more broad about the way we think about digital identity and social analysis. Can we blend traditional and social media together? Instead of using an algorithm on a single platform, could we use it on multiple platforms? If we could do this, our decisions would be much more informed, with a deeper and richer understanding of context. In some cases, it might lead us to ask a new question about traditional media or get an insight on something we previously were not able to.

Eric ended his presentation with a challenge to the audience, the “brains in the room.” Can we change the way we have traditionally done social analysis and bring value and impact to the world of social data? If the brains at Big Boulder can’t figure this out, perhaps no one can.

Social Data and Politics

It’s undeniable that unfettered access to social media has not only given each person a voice, but that the importance of that voice has helped to change the political landscape. When candidates and lawmakers are able to interact directly with voters and constituents, how does social data currently play into the way politics are done?

Currently, 61% of American adults use Facebook, and many of them discuss politics there. However, even with all this access to new information, Katherine Haenschen of Princeton University notes that it can actually create a counterintuitive effect: the more exposed people are to divergent political beliefs, the less likely they are to talk politics, or even vote. Conversely, the more involved they are with politically like-minded people, the more likely they are to take part in the political process. Social media should be creating a more informed citizenry, but that diversity of information is having the opposite of the desired effect.

Not only is the information overload problem a real one, but Sean Evins of Twitter spoke specifically about the type of information access we have, which is real-time, all the time. “You’re able to see the debates play out, you can connect with the candidate at the rally, even if you’re not at the debate or rally.” Evins noted that when C-SPAN access to the congressional floor was shut down last night, both constituents and lawmakers turned to platforms like Twitter and Periscope to watch events unfold in real time. Unmediated access has become the norm.


First came the policy platform and the stump speech. Then the 24-hour news cycle changed the way people heard politics, delivering newly formed “sound bites” – easily digestible info-bits, rapid fire, one after another. Now the top source of information for most people is Facebook, creating what Aaron Rodericks of the National Research Council Canada calls “reactionary-based politics.” Click, then share, then comment: how is a candidate to keep up?

Politicians are getting in on the action as well, even at the local level: Take a picture here, create a Facebook meme there, make a Snapchat story of the rally, live-Tweet this debate or throw shade at your political competition in 140 characters or less. Where petitions used to be door-to-door, 1 million signatures can now be gathered online in a matter of hours.

So what’s next? Haenschen and Rodericks both note that the standards and methodologies for gathering, analyzing, and interpreting social data must improve if the data is to be used in political arenas; with users coming and going, deleting their posts and tweets on occasion, it can be a challenge to determine if a realistic sample size has been collected. On the other hand, Evins observed that Twitter is changing the way polling data is gathered overall: no longer are parties limited to geographically-based phone banks. Instead, they can gather extremely targeted demographic data all over the nation to see what their constituents want, what makes them tick, and what will trigger them to get involved in the process. This kind of targeting will prove valuable to politicians of all stripes trying to tap into the collective consciousness of the country.
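One simple yardstick for the sample-size question is the classic margin-of-error formula for a proportion, ±z·√(p(1−p)/n). The sketch below (the numbers are invented) applies it, with the important caveat that social media samples are rarely random, which is exactly the methodological gap Haenschen and Rodericks describe:

```python
import math

# Margin of error for an observed proportion p from a sample of n
# posts, at ~95% confidence (z ≈ 1.96). A rough yardstick only:
# it assumes a simple random sample, which social data usually isn't.

def margin_of_error(p, n, z=1.96):
    """95% margin of error for proportion p observed in n samples."""
    return z * math.sqrt(p * (1 - p) / n)

# e.g. 52% support observed across 1,000 collected posts:
print(round(margin_of_error(0.52, 1000), 3))  # → 0.031
```

Shrinking samples (users leaving, posts deleted) widen this margin quickly: the same 52% observed in only 100 posts carries a margin of error of almost ±10 points.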

Innovation in Public Policy Analysis

“Looking at the data from different perspectives and different sources gives us more accuracy,” said Pedro Lenhard, Researcher and Creative Analyst, Department of Public Policy Analysis at the think tank Fundação Getulio Vargas (FGV). This deceptively simple statement captured a conversation about how to turn dense government information into easy-to-understand data for the public.

Pedro leads a diverse team of PhDs, mathematicians, graphic designers, and analysts to uncover trends in public policy in Brazil at FGV, building easy-to-understand infographics and interactive visualization tools for the government. “A think tank can feel very academic, but FGV provides this data directly to the public,” explained Randy Almond, Head of Data Marketing at Twitter. The mission of Pedro’s group is twofold: to raise important questions about society and improve the quality of public policy “so that people can demand more from their government,” Pedro explained.


One project captured the public’s use of emojis and emoticons across Brazil to understand people’s moods. The simple visualization was a tool FGV used to attract people to the company’s website, and it allowed people to experience social data in a fun way.
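At its core, a mood map of this kind boils down to counting emojis per region. The sketch below shows that tally; the regions, posts, and the emoji-to-mood mapping are all invented for illustration and are not FGV’s actual method:

```python
from collections import Counter

# Toy emoji-mood tally: map a few emojis to moods, then count mood
# occurrences per region across a stream of (region, text) posts.

EMOJI = {"😀": "happy", "😢": "sad", "😡": "angry"}

posts = [
    ("Rio", "great game today 😀😀"),
    ("Rio", "so sad about the loss 😢"),
    ("São Paulo", "traffic again 😡"),
]

mood = {}
for region, text in posts:
    counts = mood.setdefault(region, Counter())
    for ch in text:
        if ch in EMOJI:
            counts[EMOJI[ch]] += 1

print(mood["Rio"].most_common(1))  # → [('happy', 2)]
```

Feeding the per-region counters into a map visualization is then a rendering problem rather than an analysis problem.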

FGV transforms traditional public policy topics and opens them to non-traditional audiences. One FGV project expanded the conversation on immigration in Brazil. Where Brazilian immigration law traditionally focused on the role of human rights, FGV used data about migration patterns to uncover that Brazil has not attracted many highly-skilled professionals. By viewing migration patterns around the world, FGV was able to inform new law on immigration policy.

The 2014 elections in Brazil were accompanied by political upheaval, including demonstrations and protests. During that time, FGV developed the data into a visualization that tackled a few topics: which parts of the public hold divergent views about candidates, which groups hold similar views, how people changed sides throughout the election process, and which individuals influenced the political debates. Pedro explained, “It’s not a view of the whole, but it’s a snapshot of a fraction of society—and that fraction tells us something.”

Pedro’s and his team’s work shows how dense data can be made into data that a wide variety of people can understand. It’s not simply a repackaging of data in a visual way—it is a new experience of social data. “It takes something that’s complex and makes it into something we can enter into,” said Randy.

*Update: You can find links to some of the work that Pedro and his team produced below.

1) Social network analysis // 2014 Presidential Election – https://vimeo.com/124409337 

2) Emoji map // Mood of Brazil tool – dapp.fgv.br/lab/humor-na-rede
Twitter handle – @fgvdapp