The Future is More Content: Jeff Bezos, Robots and High Volume Publishing

Which of these sums up your view on content production?

“Content is about quality, not quantity. We should be producing high value, authoritative content regularly, not publishing lots of short posts. Less is more.”

“Winning in digital media now boils down to a simple equation: figure out a way to produce the most content at as low a cost as possible.” (Digiday 2013)

Do you agree with the first statement? Me too, until recently. But now I think we could be wrong.

The Washington Post now publishes around 1,200 posts a day. That is an incredible amount of content. My initial reaction when I read the statistic was ‘surely that is too much, the quality will suffer, why produce so much content?’ The answer seems to be that it works. The Post’s web visitors have grown 28% over the last year, and the Post passed the New York Times for a few months at the end of 2015.

The growth in published content is just one part of the long term strategy that Jeff Bezos has put in place to drive up the Post’s audience. The content strategy appears similar to the approach Bezos adopted at Amazon for long tail audiences. Many other sites and authors are also increasing their volume of published content. Across the web we are seeing significant content growth: the number of pages indexed by Google has grown from 1 trillion to 30 trillion in the last seven years.

This growth in content is likely to accelerate for many reasons: more companies adopting high volume strategies like the Post’s, growth in automated content, increased volumes of short form content, lots more video content, increased long tail niche content, and simply more internet users with access to easy to use publishing tools. The data suggests the number of new posts and articles published next year may be double that published this year.

In Cleveland this week Content Marketing World will be exploring the future of content. This is my small contribution: a look at why the future is more, not less, content and what it means for your strategy and operation.

Isn’t content all about quality not quantity?

If you are in content marketing, you will know there is a lot of advice about quality over quantity. Provide something of value, research it well, make it helpful. It is a strategy I have followed at BuzzSumo. I spend a lot of time researching posts, as I did with this one, aiming to produce authoritative, long form content that provides insights which, hopefully, are helpful to marketers. This takes time, and I produce around one to two posts a month.

I am now thinking I may have got this all wrong.

Haven’t we reached peak content? No, not by a long way

Last week I read a post that argued the future of content marketing is ‘less content’. The author predicts “content teams will be producing far less content” albeit content that is “far more interesting.” I think whilst it is true that content will take a wider range of forms, including interactive content, the future is not less content but the opposite.

My reasoning is based on a number of factors including the effectiveness of the strategy adopted by the Post and others.

Firstly, the growth in content continues unabated. It is not easy to track the number of new articles being published each year, but we can use some proxies, for example the number of content items being indexed by Google and the number of posts published on WordPress.

As we noted above, the number of pages Google has indexed increased from 1 trillion in 2008 to 30 trillion in 2014.

That is an increase of 29 trillion pages in seven years. The number of net additional pages indexed by Google each year also appears to be increasing: it was 3 trillion in 2010, 5 trillion in 2012 and 8 trillion in 2014.

WordPress publishes data on the number of posts published by blogs it hosts or blogs that use its Jetpack plugin. This is just a single content platform, but it is very popular and it gives us an indication of content growth. The figures run from December 2006 to July 2016.

Again, there appears to be steady growth in the volume of published content. In July 2016 nearly 70m new posts were published on WordPress: over 2m each day.

It is easy to say most content is poor and can be ignored, but quality content is also increasing. In addition to the high volume publishing of news sites such as the Post, in the area of science there are at least 28,100 active scholarly peer-reviewed journals publishing over 2.5m new scientific papers each year. The volume of content being published has been growing, with 4–5% more publishing scientists each year, and there is evidence that publication growth is accelerating.

There is also increased automation of content creation, as I will outline below; at this week’s Content Marketing World there is a session on ways to make your content more automated.

There is no indication from the data that we have reached peak content; in fact, the trends indicate that the volume of published content is increasing. Levels of internet access across the world are still increasing, as are literacy rates. This, combined with easier creation tools, suggests we may have some way to go before we hit peak content.

My view is that we could see a doubling in the number of posts published next year. This is simply taking the current growth in published content and understanding how this growth could be accelerated by factors such as the following (a rough back-of-envelope calculation follows the list):

  • Higher numbers of internet users and growing literacy
  • Falling costs of content production and distribution
  • Easier content production with simple to use tools, particularly video
  • The success of high volume strategies being adopted by sites such as the Post and others, encouraging more businesses to adopt similar strategies
  • A significant growth in automated, algorithm driven, content creation
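
To make the doubling claim concrete, here is a minimal back-of-envelope sketch in Python. The only hard number is the article’s July 2016 WordPress figure; everything else is illustrative arithmetic, not a forecast.

```python
# Back-of-envelope: what month-on-month growth rate would double output in a year?
# Assumption: ~70m posts were published on WordPress in July 2016 (from the
# article); the doubling target is the scenario being tested, not a prediction.

monthly_posts = 70_000_000            # July 2016 baseline
target_multiple = 2.0                 # double output within 12 months

monthly_growth = target_multiple ** (1 / 12) - 1
print(f"Required month-on-month growth: {monthly_growth:.1%}")   # ~5.9%

projected = monthly_posts * (1 + monthly_growth) ** 12
print(f"Implied monthly posts a year later: {projected / 1e6:.0f}m")  # ~140m
```

In other words, a doubling only requires sustained growth of around 6% a month, which does not look implausible given the trends listed above.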

Let’s start by looking at the high volume content strategy being adopted by the Post.

What is Bezos doing at the Post?

Jeff Bezos is a smart guy, and since he became the owner of the Post, its traffic has grown. From April 2015 to April 2016 its visitors grew 28%, and from October to December 2015 it had higher visitor numbers than the New York Times.

I wanted to understand how Bezos and the team have achieved this visitor growth. There appear to be a number of factors, including:

  • Data — Bezos employed a group of data scientists to analyze the content that gains traction
  • Technology — The Post developed software called Bandito that allows editors to publish articles with up to five different headlines, photos and story treatments, with an algorithm deciding which version readers find the most engaging (see the sketch after this list)
  • Headlines — Using data and technology the Post has developed more viral headlines, which is arguably different from clickbait if there is substance to the content
  • Video — The Post has increased its use of video
  • Content growth — The Post has adopted a content strategy which involves producing a high volume of content aimed at engaging a long tail of niche interests
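
The Post has not published Bandito’s internals, but the name suggests a multi-armed bandit approach. Below is a minimal epsilon-greedy sketch of that kind of headline test; the variant names, epsilon value and function names are illustrative assumptions, not details of the Post’s actual system.

```python
import random

# Hypothetical headline variants for one story (illustrative only).
variants = ["Headline A", "Headline B", "Headline C"]
clicks = {v: 0 for v in variants}
impressions = {v: 0 for v in variants}

EPSILON = 0.1  # fraction of traffic reserved for exploring other variants

def ctr(v):
    """Observed click-through rate; 0 if the variant hasn't been shown yet."""
    return clicks[v] / impressions[v] if impressions[v] else 0.0

def choose_variant():
    """Epsilon-greedy: usually serve the best performer, occasionally explore."""
    if random.random() < EPSILON:
        return random.choice(variants)
    return max(variants, key=ctr)

def record(variant, clicked):
    """Update stats after each page view."""
    impressions[variant] += 1
    if clicked:
        clicks[variant] += 1
```

Each page view calls choose_variant() to pick a headline and record() with the outcome; over time traffic converges on whichever variant readers actually click, without an editor having to guess in advance.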

I will cover this strategy in more detail in a future post, as content marketers can learn a lot from Bezos and the Post. However, what I want to focus on here is their strategy of increasing the volume of content, and more specifically long tail content.

The long tail theory

For those who are not familiar with the long tail theory, it was first codified (for me at least) by Chris Anderson, editor of Wired magazine. In essence, the theory sees a shift away from a focus on a relatively small number of “hits” (very popular products) in certain industries and focuses instead on the huge number of niches that exist (the long tail).

Amazon was a classic long tail company that started to cater for these niche markets. The long tail theory goes something like this.

Popular books such as the New York Times best sellers sit in the red area at the head of the curve, and most book shops historically had a heavy focus on stocking these popular books, as they knew they appealed to a broad majority of their audience. In this traditional hit-driven business it has been estimated that 80% of revenues and nearly all profits came from the top selling products.

Whilst everyone recognised the potential of the long tail, it was very expensive to produce, store and distribute products for these smaller niche audiences.

With the advent of the internet, Amazon lowered the costs of storing and distributing books and offered books to the long tail audience. Whilst the individual demand for a book in the long tail is much lower than for the best sellers, collectively there is a huge demand from people for niche or specialist books. The revenues from the long tail could actually be much larger than the revenue from popular titles.

Back in 2005 Chris Anderson noted how a then-new company, Netflix, was changing the model. At that time Netflix offered a catalogue of DVDs over eight times greater than that of the typical Blockbuster store, and a significant and growing portion of its revenues came from DVDs not available in stores: the long tail.

As the costs of production, storage and distribution fell, particularly for online and digital products, it became economically attractive to provide products for the long tail niche audience; in fact, revenue from the long tail became greater than revenue from the hits because the tail was very long indeed. Companies like Amazon and Netflix were arguably some of the first long tail companies.

Why is this relevant to content?

Long tail content

It seems to me Bezos is taking the same long tail approach to content at the Post. Of course we all want the big content article that garners millions of views, but traffic from thousands of niche articles can collectively add up to a lot more traffic overall.

By creating over 1,000 pieces of content a day you are more likely to cater for demand in the long tail for specific niche content, or simply to produce content that engages a wider audience. The Guardian has taken a similar approach and publishes similar volumes of content to the Post. The Huffington Post was reported back in 2013 to be producing over 1,200 articles a day. Sites such as BuzzFeed have also increased their content production; the Atlantic recently reported the following figures:

  • April 2012: BuzzFeed published 914 posts and 10 videos
  • April 2016: BuzzFeed published 6,365 posts and 319 videos

That is significant growth in content publication. The reason appears to be simply that more content on these sites drives more traffic. This appears to be the case in both B2B and B2C. In 2015 Hubspot looked at data from their own customers and found that both traffic and leads increased with higher content volumes.

The impact of content volume was greater in B2C but it was also significant in B2B.

Increased traffic and leads do not necessarily mean a good return on investment, but the cost of content production is falling and distribution costs are very low, enabling more content to be produced. We are now seeing a high volume approach being taken by many established B2B sites and influencers.

This content growth is not a surprise; Doug Kessler predicted an exponential increase in content back in 2013.

How many articles can you write about inbound marketing or related topics? 100? 200? 500? Well, Hubspot published over 4,000 in the last 12 months alone.

Does this long tail approach work in B2B? Let’s compare Hubspot’s approach to, say, Social Media Examiner’s. They are two of the big sites in the marketing industry.

Social Media Examiner are no slouches when it comes to publishing content, they published over 400 posts in the last year and averaged over 3,900 shares per post, which is incredibly high, even discounting the automated shares by bots.

Hubspot by comparison published over 4,000 posts, ten times as many as Social Media Examiner in the same period. Their posts were shared a lot less on average: almost 600 shares a post.

So who has the better content strategy? My instinct until now has been that you are better off being Social Media Examiner than Hubspot. You can provide higher quality, give more promotion to each post, drive higher average shares and traffic; and you get a much better return on your content investment. However, Hubspot’s articles received 2.8m shares in total compared to the 1.8m shares of posts on Social Media Examiner. That is 1m more shares, over 50% more. We don’t have traffic figures for these sites, but I would anticipate Hubspot also received similarly higher levels of traffic.

This high volume strategy has also been adopted by influencers such as Neil Patel. Over 800 articles authored by Neil have been published in the last 12 months. That is a lot of content but his articles have averaged 838 shares per article, more than Hubspot has achieved, and in total Neil’s posts have achieved approximately 700,000 shares. That is almost half the shares achieved by a major site such as Social Media Examiner.

A high volume content approach probably only works if you have built an audience with authoritative, quality content first. However, if you are an established brand, should you be looking to adopt the high volume strategies that work for the Post, Hubspot and Neil Patel? My instinct is we will see others adopting similar strategies and increasing, not decreasing, their content output.

My previous reservations have centered on how difficult and costly it is to produce quality content on a regular and consistent basis. But it might be that I have been guilty of old school thinking when it comes to content volume and long tail content.

Using robots to write long tail content

The Post announced this summer that it would use robots to write many of its Olympics stories. These posts still involve human editing, but an algorithm creates the initial story. It is easy to be critical of using algorithms to write stories, but in this way the Post can use human journalists efficiently and cater for the demand for long tail content by niche audiences. In areas like sports we are primarily talking about robots writing short articles that report the score, who scored, the time of the scores, the current league position etc. This data can easily be used by algorithms to write short reports, which may be all someone requires. Journalists can write the more in-depth and human stories, and leave the short reports to the robots.

In my view this makes sense. I don’t always need a perspective or in-depth article on movements in the financial markets, on new Google announcements or the latest soccer scores. Content writing algorithms are also getting better by the week. Don’t believe me? Go and look at the text written by Narrative Science’s algorithms.
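
Narrative Science’s technology is proprietary, but the pattern for short sports reports of the kind described above is essentially template-driven generation from structured data. Here is a minimal sketch; the team names, data fields and wording are invented for illustration:

```python
from dataclasses import dataclass

def ordinal(n):
    """1 -> '1st', 2 -> '2nd', 11 -> '11th', etc."""
    suffix = "th" if 10 <= n % 100 <= 20 else {1: "st", 2: "nd", 3: "rd"}.get(n % 10, "th")
    return f"{n}{suffix}"

@dataclass
class MatchResult:
    home: str
    away: str
    home_goals: int
    away_goals: int
    scorers: list        # e.g. ["Smith 23'", "Jones 78'"]
    home_position: int   # home team's league position after the match

def write_report(m: MatchResult) -> str:
    """Turn structured match data into a short, publishable report."""
    if m.home_goals > m.away_goals:
        headline = f"{m.home} beat {m.away} {m.home_goals}-{m.away_goals}"
    elif m.home_goals < m.away_goals:
        headline = f"{m.away} won {m.away_goals}-{m.home_goals} away at {m.home}"
    else:
        headline = f"{m.home} and {m.away} drew {m.home_goals}-{m.away_goals}"
    scorers = "; ".join(m.scorers) if m.scorers else "no goals were scored"
    return (f"{headline}. Scorers: {scorers}. "
            f"{m.home} now sit {ordinal(m.home_position)} in the table.")

print(write_report(MatchResult("Rovers", "United", 2, 1,
                               ["Smith 23'", "Jones 78'"], 5)))
```

A human editor can still review the output, as the Post does, but the marginal cost of each extra report is close to zero, which is what makes long tail coverage economic.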

At this point in time it might seem far too complex and expensive for sites to build algorithms that create content. However, these costs will fall, and algorithms will become commoditised, open sourced and available to everyone, in much the same way as is happening in machine learning. I anticipate there will be significant growth in robot written content which, whilst a relatively small share today, will be a growing proportion of all online content.

The short form content opportunity

My personal approach has been to create well researched, authoritative and long form content. I have felt good about this, as the data consistently shows that long form content gets more shares on average. However, statistics can be misleading. When I looked recently at the most shared content published by marketing and IT sites, the data confirmed that on average long form posts achieved more shares. But when I looked in more detail at the 50 most shared posts, 45 of them were short form and under 1,000 words. Thus people are very happy to share short form content and, given the pressures on everyone’s time, may prefer it.

How do we reconcile these two facts, namely that longer form content gets higher average shares but the top most shared content is made up overwhelmingly of short form content? In my view it is because there is simply a lot more, very low quality short form content. Cynically it is easier to produce poor quality short form content than poor quality long form content. This volume of poor quality short form content drags down the average shares for all short form content relative to long form content.
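
A toy simulation makes this reconciliation concrete. Flood the pool with weak short posts and the short-form average collapses, even while short posts still dominate the top of the table. All distributions and numbers below are invented purely to illustrate the skew argument:

```python
import random

random.seed(42)

# Illustrative assumption: many weak short posts, a few strong ones,
# and a smaller pool of consistently decent long form posts.
short = [random.expovariate(1 / 50) for _ in range(9000)]       # weak short posts
short += [random.expovariate(1 / 5000) for _ in range(1000)]    # strong short posts
long_form = [random.expovariate(1 / 800) for _ in range(1000)]  # steady long form

print(f"Average shares, short form: {sum(short) / len(short):,.0f}")
print(f"Average shares, long form:  {sum(long_form) / len(long_form):,.0f}")

# Which format dominates the 50 most shared posts?
pool = [(s, "short") for s in short] + [(s, "long") for s in long_form]
top50 = sorted(pool, reverse=True)[:50]
print(f"Short form posts in the top 50: {sum(1 for _, kind in top50 if kind == 'short')}")
```

With these made-up numbers the long form average comfortably beats the short form average, yet nearly all of the top 50 are short posts: exactly the pattern in the real data.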

I personally think there is a big opportunity for short form content and I aim to adapt my strategy to focus more on repurposing and republishing short form versions of my research that focus on specific issues. These could be focused around just a single image or chart.

Short form content could also take the form of serialisation. Most of Charles Dickens’s novels started as weekly or monthly instalments. His episodic approach and use of cliffhangers made the works more accessible and built an audience that eagerly anticipated the next instalment. A serialised set of articles can still become a book or long form content. Short form content doesn’t necessarily mean a reduction in quality, particularly if you are serialising your long form piece into short form episodes.

If others think in the same way we may see a significant increase in short form content.

Video and audio content

It has never been easier to produce video content and the effectiveness of video means we are likely to see increased video production. In many ways audio and video content can be produced in less time and more efficiently than written content, even with transcripts.

What are the implications of content growth?

Just because the world is going to produce a lot more content doesn’t mean you need to start trebling the volume of the content you produce. However, you do need to reflect on your strategy in a world of ever increasing content.

It is easy to decry and criticise higher volumes of content, particularly as we are likely to see a lot more poor quality content, or even ‘Crap’ as Doug Kessler calls it. However, as Mark Schaefer, who coined the term ‘content shock’, points out, for many consumers more content is a good thing. So if you are passionate about the Men’s Trampoline (yes, it is an Olympic event) there is more likely to be a series of long tail articles keeping you updated. The Washington Post did actually publish an article on the American who finished 11th in the Men’s Trampoline: a classic long tail post.

Mark Schaefer points out that in a world of content shock you will get less individual attention for your posts. This is simple math at one level: if content production outstrips growth in social sharing, we will see shares per article decrease on average. However, it does not mean content marketing is not a sustainable strategy. It is ultimately about return on investment. If you can lower content production and distribution costs and engage a small audience which converts, you can still achieve a return on your investment.

What is clear, however, is you can’t just start writing 1,000 articles a day. Google would probably be very unhappy if you did. You need to build some authority and an audience over time. However, there is a question about the point at which you increase the volume of your content and leverage the brand you have built.

Quality will still matter, even in a world of high volume content. It may not need to be long form, but it does need to meet a quality threshold. Brands should produce content that is always worth consuming, albeit it might be consumed by smaller niche audiences.

A key challenge will be lowering the cost of producing content. Is the answer different writers, better research and creation tools, more content curation, guest bloggers, more short form content, right through to automation?

Can you use algorithm written content to satisfy particular niches? For example, it would be a simple task for us to create a weekly, algorithm-written article on the most shared fashion or automotive content (a sketch follows below). While this might sound like a race to the bottom, there is an argument that quality for these types of articles is not about the prose or insights, but about the content being timely and relevant for the audience. Relevance wins over quality for this form of content, and bots will hit their deadline every time.
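
A minimal sketch of such a weekly roundup generator is below. It assumes you already have article titles, URLs and share counts from a share-tracking service; the data here is hard-coded placeholder values, not a real API or real figures:

```python
from datetime import date

# Placeholder data: in practice this would come from a share-tracking
# service or your own analytics rather than being hard-coded.
articles = [
    {"title": "Autumn fashion trends", "url": "https://example.com/a", "shares": 5400},
    {"title": "Street style roundup",  "url": "https://example.com/b", "shares": 3100},
    {"title": "Sneaker collaborations", "url": "https://example.com/c", "shares": 2900},
]

def weekly_roundup(items, topic, top_n=10):
    """Render a timely, relevant digest; no prose-writing skill required."""
    ranked = sorted(items, key=lambda a: a["shares"], reverse=True)[:top_n]
    lines = [f"The most shared {topic} content, week of {date.today():%d %B %Y}:", ""]
    for i, a in enumerate(ranked, 1):
        lines.append(f"{i}. {a['title']} ({a['shares']:,} shares) - {a['url']}")
    return "\n".join(lines)

print(weekly_roundup(articles, "fashion"))
```

The output is not great prose, but for this kind of article it doesn’t need to be: it is timely, relevant and published on schedule every week.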

In a world of high volume content your amplification strategy will be more important. Producing quality content and hoping people will find it will no longer work, if it ever did. Content promotion via your own teams and influencers, via email to your subscribers, via paid ads and via social will be critical.

The impact on us as readers

It seems to me one of the biggest challenges of content growth is for us as readers or consumers of content. There will be more articles than ever to work through. You will need good filters so you can be aware of key developments and news without being overwhelmed by content. The difficulty, as Doug Kessler says, is that poor content may increasingly look on the surface like good content, as everyone learns how to write effective headlines.

We will all need to make smarter use of curators: those people who read widely, keep up to date on specific issues and share articles and views with the rest of us. These people may curate on blogs, content hubs or simply share articles via Twitter. No one can read everything; we will need to rely more than ever on others reading and sharing articles. Teams will have to learn how to effectively leverage their members to curate and filter content.

How to filter content — some examples

The following tools and filters are personal to me but I think they provide an indication of what we all need in some form:

Content alerts

There is no way you can read all the posts published today to identify what people are saying about a particular topic or brand. You need to use tools and robots. I use BuzzSumo alerts; for example, I have an alert for mentions of BuzzSumo, and I get alerted every time we are mentioned in a web article. I also do this for specific topics such as data driven marketing.

Content aggregation and filtering

I like to aggregate all relevant new articles each day into specific briefings that I can skim and decide which articles to explore in more depth. For some briefings I use keywords; for example, I have a briefing on negative interest rates. For other briefings I simply pull together content from the experts I respect, such as a daily briefing from the best entrepreneur, SaaS and startup blogs, and one where I have simply added the Twitter handles of various SEO gurus to see what they are sharing each day.

Team sharing

On Anders Pink I am in a team with my colleagues and they help me filter by upvoting and commenting on articles. This helps me decide what to read each day. I think leveraging your colleagues in this way is something every team needs to do. It is hard to stay informed in a fast moving world but your colleagues, and your social networks, can help. Your colleagues are likely to have a finely tuned antenna for relevant industry and competitor news.

Trending content

I use BuzzSumo’s trending dashboard to see what is trending each day, i.e. what is resonating in my industry. Trending content does not mean good content, as was exposed by Facebook’s recent algorithm issues, but it is useful to see what content is resonating.

Final thoughts

Yes, the irony is most people will not have read to this point. The data shows that most people only scroll down through 50% of an article, and 55% of visitors will have left within 15 seconds, according to Chartbeat.

I should probably have serialised this post and written a number of shorter posts on key aspects, such as a chart on content growth, a piece on the Post’s content strategy or an article on long tail content. Maybe those shorter pieces would have kept your attention and motivated you to read the next article.



Designing “Richer” Legal Software Tools

Internal paradigm shifts are never easy in a corporate setting, but sometimes they are the only way to point the company in a direction that will help its products and services better satisfy customer needs.

“In recent years, we’ve made great strides at building software tools in a more agile manner, but we saw an opportunity to develop products that are richer and more useful to our customers,” explained Kenya Oduor, Ph.D., director of the user experience team at LexisNexis. “This required putting in place a new process to frame product deliverables more effectively around direct value to users.”

Oduor stressed the importance of technology teams building “empathy” for their customers so they can better understand whom they’re designing products for and what those users are trying to accomplish with the products.

“It’s important to define in advance what success looks like for your customer so that every step you take in software design and development is aligned with what that customer values,” she said.

Oduor explained that LexisNexis is now using a “design thinking framework” that brings together professionals from its product management, user experience, and engineering teams, as well as input from customers within various types of organizations. This is an approach that has been slow to emerge in the corporate IT world, but is now gaining traction in the development of business software applications.

“In the past we were focused on faster and more efficient execution in software development, but with this paradigm shift we’re trying to ensure that design thinking complements agile development,” said Oduor. “The key difference is we’re no longer building product features based on a list of requirements that a product manager hands to us. We’re now collaborating with our colleagues to understand our customers’ challenges and needs, then we go out and design tools that meet those needs.”

A recent example involved the development of Lexis DiscoveryIQ, a new enterprise eDiscovery software platform that leverages the power of machine learning to help litigation professionals predict the relevance of documents in the early stages of the eDiscovery workflow.

“When we met with colleagues and prospective customers to discuss their specific pain points, it was clear that users of the software really needed more transparency into how our product’s ‘brain’ was making relevance predictions,” said Oduor. “We used that input to design the software in a way that allowed users to feel more comfortable with the use of machine learning and the outcomes they obtained from the engine. In fact, we’re continuing to work on ways to increase the transparency in machine learning so that users have more control over how the software is put to work for them and ultimately improves outcomes.”

Oduor advised that it’s essential to have both “bottom-up” and “top-down” buy-in to the design thinking framework in order for the paradigm shift to happen. “After all, it’s important to make sure that your new software development processes fit together in a complementary fashion, without negatively impacting your existing business operations,” she said.

* * *

This post is by Daryn Teague, who provides support to the litigation software product line based in the LexisNexis Raleigh Technology Center.


Disruptive Technology is Reshaping Legal Market, Says ILTACON Keynoter

Whatever made you successful until now could also be the biggest threat to your future, warned the opening keynote speaker this week at ILTACON 2016.

“What’s happening now is the beginning of a new chapter in the tale of transformation,” said Mike Walsh, CEO of Tomorrow, a global consultancy on designing business for the 21st century, which advises leaders on how to thrive in an era of disruptive technological change. “If disruption set the stage for the reinvention of technology, then transformation is the journey we’re being called toward.”

In his keynote address, “Re-Imagining Legal Technology for the 21st Century,” Walsh challenged the ILTACON audience with three big ideas they can take home with them to assess how effectively they are transforming their organizations in light of the way that disruptive technology is reshaping the legal market.

Question #1: “How will the next generation influence the future we know?”

“The young children in our world today will be the first generation partially raised by artificial intelligence,” said Walsh. The implications for the future are widespread, but one of the more sobering realities for the legal community is that tomorrow’s workforce in our industry will be shaped by people for whom revolutionary technology such as Artificial Intelligence and Wearables will just be part of everyday life. “The most cutting-edge technologies today first found mass adoption through their use in toys,” said Walsh. “This is a wake-up call for anyone involved in professional services because we now need to re-imagine, redesign and re-invent our technology platforms. And we don’t have much time.”

Question #2: “How Will a 21st Century Law Firm Differ from a 20th Century One?”

Walsh noted that the most disruptive changes we’re likely to see in the legal industry will be led not by law firms themselves, but by their clients. In preparation for his ILTA keynote address, he interviewed Mary O’Carroll, Google’s head of legal operations. “I learned a lot from Mary about how Google has decided to try to re-invent the legal function,” said Walsh. “One thing I discovered is the power of machine learning to . . . allow internal clients to get answers to many routine transactional queries.” Walsh noted that Google is keenly interested in seeing their legal partners also embrace this movement toward automation of routine legal tasks. “Designing a 21st century law firm means building an organizational operating system that allows your people to thrive,” advised Walsh in his presentation slides. “Hire for agility, build more social workspaces, rethink your communications and use data to hack your culture.”

Question #3: “How Will AI, Algorithms and Automation Impact the Legal Profession?”

“The key challenge for any leader in the 21st century will be re-inventing themselves to manage in an environment of AI, automation and real-time data,” said Walsh. “Embracing the future means challenging everything we know to be true.” Walsh pointed to the example of DoNotPay, an AI “chatbot” that disputes parking tickets, which has overturned 160,000 fines in just a few months. “None of this is particularly good news for members of the legal profession because it’s showing how transformative the application of simple technology may be,” he said. But he encouraged ILTACON attendees to think less about whether algorithms and automation will replace lawyers, and more about what a 21st century lawyer ought to look like. For example, explore how we might use real-time data and visualization tools, observe how new sources of data can create value in the future, and think about changes to clients’ decision-making processes.

ILTACON attendees agreed that the new era of disruptive technologies in the legal space creates an opportunity for the creation of agile, innovative eDiscovery software tools that put litigation teams in a position to thrive. One example is Lexis DiscoveryIQ, a fully integrated enterprise eDiscovery software platform developed by LexisNexis and enhanced by Brainspace Corp. This new technology disrupts the traditional linear review model, from PreDiscovery to production, by rethinking the role of early case assessment and providing powerful analytics throughout the eDiscovery process.

ILTACON is a four-day educational conference that draws on the experience and success of professionals employing ever-changing technology within law firms and legal departments. Next week, we’ll recap highlights from one of the sessions that generated buzz about how future technology will affect litigation support functions.

* * *

This post is by Daryn Teague, who provides support to the litigation software product line based in the LexisNexis Raleigh Technology Center.


ILTA/InsideLegal Study Indicates Legal Tech Budgets on the Rise

A new study conducted by the International Legal Technology Association (ILTA) in conjunction with InsideLegal suggests a majority of law firms, 53%, report an increase to their technology budgets in 2016. In spite of this, survey respondents cite issues of security management, user adoption, risk management/compliance and email management as their “Top IT challenges in 2016.”

The 2016 ILTA/InsideLegal Technology Purchasing Survey, now in its 11th edition, debuted today during the ILTACON Annual Conference in National Harbor, Maryland.

According to the study, the across-the-board increase to the technology budget represents a 12% growth rate compared to the year prior. The survey shows respondents in the small to mid-size law firm range report the biggest increases to their legal tech budgets in 2016, including:

  • 39% of Small Firms (Firms with 1–49 attorneys)
  • 38% of Medium Firms (Firms with 50–199 attorneys)
  • 23% of Large Firms (Firms with 200 or more attorneys)

The survey indicates a majority of firms plan to spend the bulk of their legal technology budgets in the following key areas:

  • 61%, Desktop Hardware/PCs
  • 59%, Laptops/Notebooks
  • 53%, Network Upgrades/Servers
  • 44%, Printers/Multifunctional Devices
  • 44%, Antivirus/Antispam/Spyware Software

Not surprisingly as security continues to be top of mind for firms of all sizes, the survey reveals three new areas added to the “master list of purchases,” including:

  • 27%, Security Awareness Training Services, Software and Content
  • 27%, Security Monitoring Services for the Network
  • 3%, Artificial Intelligence Technology

Of note, nearly 20% of respondents reported investing in analytical software within the last 12 months, representing an 11% increase from the year prior. Additionally, the study shows a quarter of firms plan to upgrade their cloud storage in the coming year, versus just 15% in the year prior.

In terms of who is requesting the purchase of technology for the firm, attorneys are at the top of the list, with 72% making the request, followed by:

  • 42% Law Firm Administration
  • 23% Litigation Department/Practice Support
  • 23% Marketing
  • 19% Staff

In the study, Jobst Elster, Head of Content & Legal Market Strategy for InsideLegal and survey co-contributor, stressed the importance of using the study findings to better understand where the legal tech business is headed:

“When we first issued this survey back in 2006, our mission was clear, to provide legal technology companies, in particular those invested in ILTA, with details on ILTA member firm budgeting, technology purchases (actual and planned), technology purchasing influences and details on legal technology trends and legal IT challenges. Over time, the survey has evolved into a ‘firm-forward’ resource with budgeting, technology purchasing, trending data, and service provider satisfaction data firms themselves increasingly use for benchmarking and as an important data point of their competitive intelligence strategy.

My point … inhale this survey, make time to read it, analyze it and discuss it with your peers, because chances are your customers and prospects have already done so.”

For more ILTACON 2016 updates visit: http://insidelegal.typepad.com/files/. In addition, for those attending this year’s ILTACON Conference, be sure to visit LexisNexis at booth #217 for product-related demos and updates.

***

This post is by Carla Del Bove, who provides support to the business of law software product line based in the LexisNexis Raleigh Technology Center.


In Their Own Words, Customers Share CounselLink Business Benefits

They say every good relationship begins with open communication. This is especially true when it comes to managing a customer relationship. At CounselLink, our customer philosophy is about more than a simple vendor/customer relationship; we view these relationships as long-term business partnerships.

This means soliciting regular feedback from customers to answer the age-old question — how well are we doing?

In other words, are we listening to the needs of our customers and addressing their daily challenges with real, tangible solutions? Are we helping our customers control outside legal spend, drive better outcomes and communicate more efficiently with their internal and external legal constituents? And, most importantly, would our customers recommend CounselLink to their colleagues and peers?

While we strive for the answer to all of these questions to be a resounding yes, we know that we are only as good as our customers believe we are. So, rather than take our word for it, here’s what some of our customers have to say about CounselLink, in their own words:

Timothy Donovan, General Counsel, Caesar’s Entertainment Corporation Las Vegas:

“When I started at Caesar’s Entertainment seven years ago there was no e-billing system. There was little data available to me, so CounselLink gave me the tools to be able to capture that information, utilize it in connection with managing outside counsel, and to have outside counsel use it for setting budgets. And, for my lawyers, to be able to manage as best they could against those budgets, because they were put into the CounselLink system.”

Erin Gray, Deputy General Counsel, CNL Financial Group:

“CounselLink has solved two distinct challenges that we’ve experienced at my firm. One, with legal spend, we are actually now not only able to speak to what our annual legal spend is, but we are able to distill that data into various matter types, by law firm and even by in-house counsel that’s responsible for overseeing the matter. Secondly, on the matter management side, with tracking the length of time it takes to get from beginning to end of a matter.”

Megan Conkrite, Contract Specialist, Hospital Sisters Health System (HSHS):

“We have hospitals spread out over two states and four different divisions and it’s a great way for our Office of the General Counsel to be able to connect with each other.”

Feedback such as this continues to drive us to listen and learn from our customers, so we can become better business partners to them. Our goal centers on one mission: helping our customers do more with less. We believe this can be achieved when you have the right tools, along with the right process and people in the mix.

We’re happy to see that our customers tend to agree.

***

This is a post by Stephen Fisher, Manager of Corporate Legal Account Management for the LexisNexis CounselLink business.


New Federal Directive Clarifies U.S. Cyber Incident Coordination

Last week’s news reports that hackers had breached the New York Times were the latest reminder of the existential threat that all private-sector businesses face from cyber attacks. In the legal community, cyber incidents have taken place at nearly all major U.S. law firms, and cybersecurity challenges are increasingly affecting small and midsized law firms as well.

Earlier this month, the Obama Administration’s National Cyber Investigative Joint Task Force released a new cybersecurity alert that “establishes a unified federal government response to potential cyber incidents,” according to the ABA Cybersecurity Legal Task Force.

“This alert provides an excellent fact sheet for when, what and how to report to federal agencies in the event of a cyber incident,” said Jeff Norris, CISSP, senior director of data security for LexisNexis Managed Technology Services. “If law firms are obligated by law or contract to report an incident, they should comply with that obligation as noted in the federal directive. If voluntarily reporting, the alert provides a useful list of relevant federal agencies and their specific points of contact.”

The “Presidential Policy Directive on U.S. Cyber Incident Coordination” provides clarity on the cross-agency federal response posture to private-sector cyber attacks. Excerpts from the directive include the following:

When to Report: “victims are encouraged to report all cyber incidents that may result in a significant loss of data, system availability, or control of systems; impact a large number of victims; indicate unauthorized access to, or malicious software present on, critical information technology systems; affect critical infrastructure or core government functions; or impact national security, economic security, or public health and safety.”

What to Report: “A cyber incident may be reported at various stages, even when complete information may not be available. Helpful information could include who you are, who experienced the incident, what sort of incident occurred, how and when the incident was initially detected, what response actions have already been taken, and who has been notified.”

How to Report: “Private sector entities experiencing cyber incidents are encouraged to report a cyber incident to the local field offices of federal law enforcement agencies . . . The federal agency receiving the initial report will coordinate with other relevant federal stakeholders in responding to the incident. If the affected entity is obligated by law or contract to report a cyber incident, the entity should comply with that obligation, in addition to voluntarily reporting the incident to an appropriate federal point of contact.”

“This alert serves as an important reminder to law firms that they should have an Incident Response plan that addresses how they will respond to a cyber incident,” said Norris. “This should be done in conjunction with the partners, the communications team and the business leaders to understand specific reporting steps you will take and any communications you will generate, both internally and externally. Law firms are good at serving as external counsel to companies on how they should handle data breaches, so it’s important to take their own advice when it comes to their incident response planning.”

* * *

This post is by Daryn Teague, who provides support to the litigation software product line based in the LexisNexis Raleigh Technology Center.
