Presented at the 2012 ACM Conference on Computer Supported Cooperative Work (CSCW 2012), the following original research on the value of microblog content is shared for your review and consideration.
Prepared by Paul André (Carnegie Mellon University), Michael Bernstein (MIT), and Kurt Luther (Georgia Institute of Technology), “Who Gives a Tweet? Evaluating Microblog Content Value” offers quantifiable insight into the perceived value of Twitter “tweets” through the lens of content, context, and evolving social norms.
Primary questions considered as part of this milestone study, which collected over 43,000 volunteer ratings of tweets, include:
“Conventional wisdom exists around these questions, but to our knowledge this is the first work to rigorously examine whether the commonly held truths are accurate. Further, by collecting many ratings, we are able to quantify effect sizes. A better understanding of content value will allow us to improve the overall experience of microblogging.” (Study Authors)
Predictors of “tweet” value in this study were based on “worth reading,” “neutral,” or “not worth reading” ratings of individual tweets drawn from eight specific categories that included:
Additionally, the reasons readers gave for “liking” or “disliking” a tweet included:
Reasons for Liking
Reasons for Disliking
PDF Version of Study: Click here.
Source: “Who Gives a Tweet? Evaluating Microblog Content Value” – Paul André (Carnegie Mellon University), Michael Bernstein (MIT), and Kurt Luther (Georgia Institute of Technology) – as prepared for CSCW’12, February 11–15, 2012, Seattle, Washington.
This entry was posted on Tuesday, February 21st, 2012 at 1:49 pm.
Providing timely articles, expert insight, and industry research, the Weekly eDiscovery News Update is the trusted source for relevant eDiscovery, corporate risk and vendor news and views for legal and technology professionals. Sign up today.
Drawn from a combination of public market sizing estimates shared over time in leading electronic discovery reports, publications, and posts, the following eDiscovery Market Size Mashup presents general worldwide market sizing considerations for both the software and services segments of the electronic discovery market for the years 2012 through 2017.
Individuals and organizations have gathered and used information to achieve personal and professional objectives since the beginning of time. However, with the advent of tools and technologies that can greatly accelerate this gathering and use of information, it is increasingly important to consider not only the positive outcomes of greater information access but also its potential dark-side uses.
Just as there are many tasks in electronic discovery, there are often multiple technologies and platforms involved in the complete electronic discovery process. When multiple technologies and platforms are involved, data must be transferred between these disparate systems, and this data transfer can be considered a risk factor affecting the overall electronic discovery process.
In today’s “sound-bite” environment, in which professional organizations compete for client attention through a variety of conduits and communications, it is increasingly important for marketing and sales leaders to coordinate the use of all communications tools in order to maximize impact and influence on potential clients.
Beginning in early 2012, the topic of Technology-Assisted Review moved from expert-led explanations to mainstream mentions in legal community articles, opinions, surveys, and reports. Provided for your research, review, and consideration is a compilation of key headlines and links from online sources on the topic of Technology-Assisted Review from February 2012 until now.
Updated: 9/16/2013 – Provided for your consideration and use are the in-progress results of the One-Question Provider Implementation Survey launched by ComplexDiscovery on 3/3/13. The results consist of survey answers harvested directly from the online survey form as completed by provider representatives.
Updated 7/23/2013: Provided for your consideration and use are the in-progress results of the Predictive Coding and Provider Survey launched by ComplexDiscovery on 2/10/13. The in-progress results consist of survey answers harvested directly from the online survey form as completed by provider representatives.
Based on a website review of leading providers in the electronic discovery arena, the following list provides a quick, non-exhaustive reference of firms that appear to have developed “technology assisted review” technology (one form of which is “predictive coding”) for their own and/or partner offerings.