In the U.S., 96 percent of all civil cases are resolved short of a jury trial. Along the way, however, costs associated with litigation discovery (e.g., preservation, collection, review, production) are financially burdensome to both the plaintiff and the defendant. In many instances, the opposing party will even strategize on how to make it as difficult as possible to comply with the eDiscovery process, including time and cost to respond to discovery requests. How do you know what your risk is? Should you settle? Do you have a sound legal strategy?
These are the kinds of questions that litigators must ask themselves during the very first phase of the eDiscovery process. They have a crucial need to weigh the risks and benefits of taking a case to trial against entering into what is often a series of protracted settlement discussions. The most pressing issue at hand — developing a winning litigation strategy — should not get lost in a discussion about data management.
This is the second post in a two-part series on how the litigation technology industry has sought consensus on the distinction between two key eDiscovery concepts: Early Data Analysis (EDA) and Early Case Assessment (ECA).
In our previous blog post on this subject, we reviewed how the proliferation of eDiscovery software tools has transformed ECA into more of a data management exercise. These tools — many of which were developed in response to rising concern from in-house counsel about the soaring expenses of preservation, collection and review — gradually shifted the focus from case assessment to data assessment.
But a growing number of industry experts are observing that a new generation of technology has arrived to make the EDA/ECA distinction meaningful at last. Specifically, we now have software tools available that can empower litigators to make better decisions about their cases by having more useful insights at earlier stages in the development of litigation strategy.
“A new approach to eDiscovery technology development should give a litigation team the tools to identify clear litigation objectives and then build a plan that is focused on achieving those goals,” said Steve Ashbacher, vice president of litigation solutions with the LexisNexis software and technology business. “It should provide a way to analyze the case by seeing the full story so you can more clearly see what your case is trying to tell you.”
This new approach revolves around the idea that eDiscovery should not be conducted along a linear review continuum, but rather through the lens of early case assessment, according to Ashbacher.
“Remember, the primary purpose of EDA is not to learn every fact or review all documents, but rather to gain enough information about the stakes in front of you in order for the litigators to decide how to proceed,” he said. “The more that your eDiscovery software tools can help you obtain that knowledge, the less money and time you will need to spend on setting your case strategy.”
Advanced search engines power the new generation of eDiscovery software tools, which feature visual analytics that are designed specifically for the rigors of eDiscovery. This new technology disrupts the traditional linear review model, from PreDiscovery to production, by rethinking the role of ECA and providing powerful analytics throughout the eDiscovery process.
One example is Lexis DiscoveryIQ, a fully integrated enterprise eDiscovery software platform developed by LexisNexis and enhanced by Brainspace Corp.
“With Lexis DiscoveryIQ, LexisNexis and Brainspace are challenging the status quo,” said Ryan Bilbrey, managing director of OmniVere, LLC, and a member of the panel of advisers for the development of the new platform. “Disruption is a term that’s tossed around almost callously in legal technology circles these days, but in this case, here is a team that has reimagined the eDiscovery process to help litigation teams achieve vastly better results.”
The result is a comprehensive platform, focused on ECA, which brings to bear the most advanced eDiscovery technology and analytics tools available. This and other new eDiscovery software tools are shattering the myth that visual analytics and other sophisticated tools can only be used on “bet the company” matters by making them cost-effective for any size of case.
The “EDA vs. ECA” conversation is not new in our industry; thought leaders in eDiscovery have been surfacing this confusion in terminology for a few years now. But here’s what is new: this conversation is no longer just a deep dive into IT minutiae for data scientists.
“Advanced technology solutions have now been commercialized in a way that makes the distinction between these two things crucial for litigators to understand,” said Ashbacher. “Litigation teams — both in-house and outside counsel — can now put these software tools to work in a way that provides them with earlier insights into their cases, resulting in better litigation strategy at reduced costs to their clients.”
We often hear that it’s important for a business product or service to be “scalable” — in the world of information technology, as TechTarget defines the term, that means the software will “continue to function well when changed in size or volume in order to meet a user need.”
One global provider of legal and business solutions has proven that truly scalable eDiscovery tools can be crucial to delivering high-quality results to clients at more affordable price points.
UnitedLex is a leading global provider of legal and business services that integrate strategy, consulting, technology and operations to deliver solutions that address their clients’ most complex business challenges. The company has more than 2,000 employees — attorneys, engineers, financial analysts, and consultants — based in 22 offices across seven countries.
According to Joshua Tucker, assistant manager for EDD Operations at UnitedLex, the firm works with government agencies, law firms and corporations that face the challenge of managing electronic data in investigations or litigation.
“Sometimes they do internal audits and sometimes they’re just preparing in case anything happens,” said Tucker. “We can advise them on how to collect and hold their data so that it’s meeting with the standards of their state. And then if something happens, we preemptively discover it for them so that we can give them the best advice before or during a lawsuit.”
In order to provide robust consulting solutions, UnitedLex needs to rely on effective technology tools. After a disappointing experience with one eDiscovery platform, the firm asked Tucker and his colleagues to review competitive tools and find one that would work better for their specific needs.
“LAW® PreDiscovery® was the most versatile of all of them, gave the best results and could be built upon,” said Tucker. “UnitedLex went forward with LAW because we were able to get the most bang for the buck.”
LAW PreDiscovery is an advanced imaging and eDiscovery software application from LexisNexis. LAW PreDiscovery helps litigation professionals take control of eDiscovery early in case analysis so they can cut millions of documents down to size prior to costly review. LAW is compatible with almost any type of file format, allowing users to import documents in nearly any format and then export them directly into leading document review platforms. It can handle even the largest imaging, endorsing, OCR or print jobs simply by adding additional workstations.
“We are handed something that’s a cluster and we make it work,” said Tucker. “With LAW, we can tackle any size imaging and OCR job to deliver the best product to our clients.”
Because LAW is compatible with other leading programs, Tucker’s team is able to quickly integrate it into their workflow and save the day for clients who were unhappy with other vendors’ work. In addition, LAW is scalable for any size job, “whether you’re processing three files or three terabytes,” he said.
To learn more about how UnitedLex leverages scalable software tools to deliver the best possible eDiscovery solution to its clients, please click here.
I don’t need to tell you how frustrating it can be to work hard on a blog post, infographic, podcast or SlideShare, only to notice later that it didn’t get as many shares as you would have hoped. Every content marketer has been there, especially when starting out.
You also know that publishing the article is only the first step and you’ve done some legwork to promote it. You’ve shared it to your social pages, you’ve sent out an email blast to your users. And it works! After a couple of days, you get a hundred social interactions. After a week, maybe even a couple hundred interactions. Everything seems to be in order.
Then you visit one of your go-to resources for all of your marketing insights and look at their most recent blog post. It has 2k shares and it’s only been up for a couple of days.
Before you get too discouraged, take this into consideration: how big is your audience?
Because if your audience is significantly smaller than many influential marketing tech companies, you might need to cut yourself some slack.
Don’t beat yourself up over not hitting the same numbers as the big guys. Your content might be doing better than you even realize. To see how well your content is actually performing, you need to normalize your shares.
Comparing shares for your content with a site that has a huge audience isn’t the most productive way to go about benchmarking your content. Of course their numbers will be higher. What you need to do is to look at how your posts are performing relative to the size of your audience. This is called normalizing your shares.
And no, this is not just a way to make excuses for low performance–it’s a way to be realistic. Once you know what numbers you should realistically be hitting for the size of your audience, you can set achievable and scalable goals.
Here’s the cold hard truth: odds are that unless your site is one of Inbound’s top 115 MarTech companies, your content won’t get even a fraction of the interactions that theirs does. That’s because content follows a power law: shares and even readership disproportionately favor the top 10% of the best-performing sites. The majority of content gets zero to thirty shares. So if your site isn’t in the top 10%, your content will go virtually unread.
But there is a silver lining to this truth. If your content has even a dozen meager social interactions, that means that it’s already performing better than the majority of online media out there. And if you’re a conscientious marketer, then you will probably average more than twenty social interactions per article. Chances are you’re already off to a good start.
How to Normalize Your Shares
There are three steps to this process:
1. Collect your shares.
2. Normalize your shares.
3. Analyze your shares.
Collect your shares
It’s prudent to not only look at your total shares, but also to look at your shares separately by social media platform. This way, you can identify gaps in your social media strategy and see which channels you need to work on. Track the following:
Shares per post on Facebook, Twitter, LinkedIn, Instagram, Pinterest (anywhere you have a social presence, no matter how small)
Blog shares per follower (this is where normalizing your numbers comes in)
Types of blog posts (listicles, infographics, how-to guides, videos, etc. as well as content length)
Use a tool like BuzzSumo to collect metrics over different periods of time. To create benchmarks, we looked at metrics for a six-month period and a single-month period. These time frames will give you a better understanding of the patterns and trends in your content’s traffic.
Next, go to your social media pages and record how many followers you have on each.
Put these metrics into a spreadsheet.
For example, here are the Buffer blog’s followers and shares over a six month period (July 1, 2015 — December 31, 2015).
Normalize your shares
Normalizing your shares is very simple. Just divide the number of social shares on a blog post by the number of followers on that particular social media platform in order to get your normalized share percentage. This will tell you how engaged your audience is.
Returning to Buffer, consider their average Twitter shares per blog post over a six month period: 908. Not bad. But now, normalize those shares: 908/579,000 = 0.16%. That’s a very low rate of shares per follower.
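The normalization step is simple enough to sketch in a few lines of Python. The function name and formatting below are my own; the figures are the Buffer numbers cited above:

```python
def normalized_share_rate(shares_per_post, followers):
    """Shares on a post divided by followers on that platform,
    expressed as a percentage."""
    return shares_per_post / followers * 100

# Buffer's average Twitter shares per post over six months: 908,
# against roughly 579,000 Twitter followers.
rate = normalized_share_rate(908, 579_000)
print(f"{rate:.2f}%")  # → 0.16%
```

The same function works for any platform: plug in the per-post share count and that platform’s follower count.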
Or take another example: HubSpot, which has a very high average number of shares per blog post. Over a six-month period (July 1, 2015 — December 31, 2015) HubSpot published 1,604 posts (the second-highest post count of the 115 companies).
Their average total shares per post over those six months was 830. That’s way higher than most other MarTech sites–an average of 166 shares per post on Facebook and 349 shares per post on Twitter. But considering their followers on Twitter and Facebook alone exceed 1.5 million, their normalized share count isn’t actually that high: 0.05% on Twitter and 0.015% on Facebook.
Now you’ve got some perspective.
Analyze your shares
Once you’ve normalized your shares on a month-over-month basis and on a multi-month basis, you can look for places where your content strategy can be improved.
Shares by Social Media Platform
Your analysis will reveal which platforms have a more engaged audience and which platforms you may need to target more.
For example, you might notice that your shares on Facebook are substantially lower than your shares on Twitter. This might cause you to look into your audience on Facebook and find a substantial community that you haven’t been targeting! Now you can adjust your content strategy to better target Facebook.
Average Shares by Content Type
Your analysis might reveal that certain type of content performs better than other forms of content–for instance, posts featuring infographics versus posts without data visualizations.
BuzzSumo’s Content Analysis tool allows you to look at the average shares per content type. Take a look at an analysis of Hootsuite blog’s content over a six month period.
As you can see, “Why Posts” and “How Articles” are the most-shared types of content–content that is actionable and explanatory.
Number of Shares Per Unique Page Views
Another way to slice and dice your information is to look at the number of shares per unique page views on a specific article, and on your articles over a period of time. This will give you another insight into how engaged your audience is, and what type of content 1) gets more views and 2) converts to shares. To get this metric, you can use Google Analytics. For example, take a look at this sample from BuzzSumo’s blog:
As you can see from this sample, more page views do not necessarily mean more shares. The article titled “Our First Year as Content Marketers at GetApp” has the lowest number of unique page views but by far the greatest number of shares per unique page view. This means that the audience that did view the article, despite being smaller than the rest, was far more engaged.
Once you see which posts had the greatest audience engagement via shares per unique page view, you can cross-examine the post’s traffic and break the metrics down by post type and by normalized shares per social media page.
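Shares per unique page view is just another ratio, so it can be computed the same way. Here is a minimal Python sketch; the titles and figures are illustrative, not the BuzzSumo sample shown above:

```python
def shares_per_unique_view(shares, unique_page_views):
    """How often a unique visitor converts to a share."""
    return shares / unique_page_views

# (title, total shares, unique page views) -- illustrative numbers
posts = [
    ("Post A", 120, 4000),
    ("Post B", 95, 800),
]
for title, shares, views in posts:
    print(f"{title}: {shares_per_unique_view(shares, views):.3f} shares per view")
```

A post with fewer views but a higher ratio, like Post B here, is the kind of high-engagement outlier worth cross-examining by post type and platform.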
When analyzing your metrics, it can be helpful to visualize your findings. Because I’m a visual learner, I find that visualizing data helps me find patterns and draw connections. Visualizing your data is easy if you use a simple chart maker.
Benchmarks for Shares on All Networks
Here are benchmarks for content shares, based on an analysis of Inbound’s list of the top 115 MarTech companies. We assigned a percentile and a corresponding letter grade to the average number of shares. This way, we could see how well our content was performing in comparison to the other content out there. For example, if your posts get enough shares to earn a B grade, your content is performing better than 75 percent of other content.
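The grading logic is a simple percentile lookup. Only the B = 75th-percentile cutoff is stated in the article; the other cutoffs below are assumptions for illustration:

```python
def letter_grade(percentile):
    """Map a percentile rank to a letter grade.
    Only the B >= 75 cutoff comes from the article; the rest are assumed."""
    if percentile >= 90:
        return "A"
    if percentile >= 75:
        return "B"
    if percentile >= 50:
        return "C"
    if percentile >= 25:
        return "D"
    return "F"

print(letter_grade(80))  # → B
```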
Here’s the benchmark for total shares:
We also benchmarked shares for each social media platform:
Here is the benchmark for Facebook shares per single blog post. As you can see, in order for your content to perform better on Facebook than 90% of other content, you need to get at least 144 shares.
Now, below is the benchmark with normalized shares. What we found was that shares on Facebook for a given audience were actually quite low.
This benchmark is for Twitter shares per single blog post. As you can see, there are generally about three times as many shares on Twitter as on Facebook.
And here are the normalized shares for Twitter.
Monthly Total Shares
While it’s useful to identify places for improvement on a platform by platform basis, it’s also important to not lose sight of the bigger picture. That’s why we also benchmarked the average total monthly shares for the top 115 MarTech companies. If you know what total share count you should be targeting month over month, you can then look at ways to achieve that in your content strategy.
This could mean doing more outreach for your content, targeting a specific social media platform, publishing more content in a month’s time, or any combination of factors.
Again, you can normalize these total monthly share counts to get a realistic snapshot of your blog’s performance. Just divide the total number of shares by the total number of followers. Look for correlations between your content’s performance on individual sites and the total monthly performance.
Set Realistic Goals
The purpose of this article is not to encourage you to be comfortable with your blog performance as it is–it’s to give you an actionable way to scale your content. Not only should you look for ways to gain new followers, but you should also look for ways to better engage your existing audience.
Being active and engaging on social media is something that a lot of companies struggle to do effectively. That’s why it’s so important to understand how engaged your audience is and where that engagement is happening. I hope this article has been helpful in pointing you in the right direction. Feel free to leave any questions in the comments.
Today, we riff on one of the key tactics Woodyear recommends that firms adopt in order to optimize the results of their digital strategy — the use of LinkedIn.
Not unexpectedly, the growth in General Counsels’ reliance on social media to vet law firms and lawyers mirrors the growth in the use of social media by law firms and lawyers themselves. According to surveys reviewed by Woodyear, between 60 and 73 percent of GCs check out lawyers and their firms online.
A LinkedIn-based Social Media Strategy — A 5-Step Plan for Success
With so much riding on a digital first impression, Woodyear recommends using a presence on LinkedIn as a cornerstone of a firm’s social strategy. She offers the following 5-step plan for firms to ensure that their LinkedIn-based social strategy is on track.
1. Make sure the majority of the firm’s lawyers are on LinkedIn
Take advantage of the synergistic effect of having multiple lawyer connections associated with your firm. Encourage all of your firm’s lawyers to create a profile on LinkedIn, offer assistance with developing a suitable bio, if necessary, and follow up to confirm participation.
2. Encourage lawyers to create bios optimized for social media
Simply cutting and pasting information from a resume or CV can result in a LinkedIn profile that ranks low in searches. Lawyers should highlight credentials, specialized skills and accomplishments that correspond to how clients and prospects search for a lawyer. Optimizing bios to rank high in online search results isn’t difficult, but it requires a laser-sharp focus on identifying and deploying the most powerful and relevant terms in the lawyers’ bios.
3. Ensure that lawyers are actively engaged in connecting with colleagues and clients
The difference between merely setting up a LinkedIn account and leveraging LinkedIn to become a social media thought leader is simply a matter of concentrating one’s efforts on creating, sharing and curating relevant content. Once a lawyer has established a targeted audience of colleagues, clients and prospects, it’s important to engage that audience. Nothing beats informative and useful content served up on a regular basis.
4. Emphasize the importance of sharing thought leadership with connections on a regular basis
Developing a consistent cadence of social posts helps build and sustain an audience that’s not only engaged, but looks forward to each new piece of relevant thought leadership. When creating content becomes part of a lawyer’s daily or weekly routine, the process becomes less burdensome and may actually become a task he or she looks forward to.
5. Assess what content results in the most engagement
By focusing social media efforts on the types of content readers find most interesting or thought provoking, you can fine-tune your social strategy to yield the best return on your social media investment.
A Tool for Monitoring Brand Strength
As Woodyear mentioned in her previous post, Kredible is a useful tool for helping lawyers effectively use LinkedIn to strengthen their personal brand as well as the firm’s brand by optimizing online presence. The scoring Kredible provides helps lawyers focus on making sure the right information is being shared, and that the firm’s brand guidelines are being followed. Kredible also provides analytics that deliver insights into the effectiveness of a content strategy.
Create, Curate, Measure and Refine
Woodyear emphasizes the importance of lawyers strategically curating content for distribution to their clients. Because even the most interested clients or prospects will likely spend only a few seconds scanning content, the most important messaging has to stand out. That means avoiding long bodies of text and instead, delivering content in a user-friendly and engaging format.
Once the digital strategy, content and delivery mechanics have been fine-tuned, Woodyear recommends leveraging the content by creating LinkedIn campaigns that promote thought leadership to a precisely defined audience. “LinkedIn campaigns are a very cost effective and targeted way to make sure the attorneys’ thought leadership is being seen by the intended audience,” Woodyear says. She suggests experimenting with different types of content, infographics, and various headlines, and then leveraging LinkedIn’s analytics to identify the top-performing content. One of the measurement tools LinkedIn offers is the Social Selling Index. Updated daily, the index measures how effective an individual is at establishing a professional brand, finding the right people, engaging with insights, and building relationships.
Woodyear is an advocate for making use of the power of analytics to shape content strategy. “Before each campaign, you should clearly define your objectives and KPIs to measure its success. You should also be sure to review metrics regularly with key management. If you collect lots of data and no one reviews it, then it’s pretty useless,” she cautions.
“Use data to refine and adjust your processes. Over time, you’ll be able to predict what content works best, what events are most successful, what pitches you’ll win, what type of client is most profitable. Ultimately, you’ll be able to effectively use data to uncover new opportunities and prospect for additional business.”
“You can’t boil the ocean,” said Kris Satkunas, director of strategic consulting for LexisNexis CounselLink, during a webinar focused on driving operational efficiency in the legal department. Instead of trying to accomplish everything at once, she suggested homing in on a handful of metrics that clearly identify what is most important to the legal department.
“The legal department is a cost center, we have heard,” Satkunas said. “Using data related to cost savings can help demonstrate the value of the legal department to the organization.”
During the discussion she offered three examples of real-life legal departments in various industries in which the legal department used a well-defined, systematic process to better manage its outside legal partnerships in the areas of price, panel selection, and matter budgeting. Here’s a closer look at one anecdote she shared:
The Insurance Company
A large insurance company sought to put a data-driven law firm assessment process in place, using a scorecard approach to better determine pricing of matters. To accomplish this, the legal department first identified which attributes it valued most in an outside counsel partner. In doing so, it realized that some of the attributes it deemed most valuable were subjective in nature, such as:
How well does the law firm communicate with the legal department?
How creative is the law firm in its approach?
Once the legal team settled on a handful of key attributes against which to measure the department’s outside legal vendors, it quantified them on a scale of 1 to 4 to develop a scorecard. As a result, the legal team can now draw comparisons and make more data-driven decisions about how well its legal vendors perform in these areas. In essence, the legal department now uses a systematic, closed-loop process to measure outside counsel partners and make direct comparisons on matter pricing. It also shares the feedback it gets from the process with its outside legal vendors, letting them know where they stand. As an added bonus, the department’s relationships with its outside legal vendors have improved, and it is headed toward a much more data-driven vendor management process.
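A scorecard like the one described can be as simple as an average on the 1-to-4 scale. The attributes and numbers below are purely illustrative, not the insurer’s actual criteria:

```python
# Hypothetical vendor scorecard on the 1-4 scale described above;
# attribute names and scores are illustrative assumptions.
scores = {
    "communication": 3,   # how well the firm communicates with the department
    "creativity": 4,      # creativity of the firm's approach
    "pricing": 2,         # competitiveness on matter pricing
}
average = sum(scores.values()) / len(scores)
print(f"Vendor score: {average:.2f} / 4")  # → Vendor score: 3.00 / 4
```

Scoring every vendor against the same attributes is what makes the direct, data-driven comparisons possible.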
“Select the best metrics that are most important to you,” said Satkunas.
Enterprise Legal Management (ELM) solutions, which route information and documents related to legal matters and automate tasks that people might otherwise have to manage manually, are an example of how technology can drive maturity and free staff to attend to more core work responsibilities. In addition, ELM solutions can help the legal department communicate more efficiently and effectively, both internally and externally with its law firm partners.
Analysis is another essential part of the maturity process. The discovery and communication of meaningful patterns of data can be used to make more informed decisions related to matter cost against the legal budget. By looking at consistency of outcomes, legal departments can better understand and forecast the various phases of the matter life cycle.
Process. Quite simply, process refers to the number of steps or actions the legal department needs to take to reach a particular end. Determining budgets or assessing outside law firm panels are both examples of actions that require a detailed process.
Bringing these three things together can get the legal department to the best place, Satkunas says.
Although Satkunas says there is no right or wrong place to be on the maturity scale, she encourages legal departments to advance as close to the maturity side as they are ready to go.