5/7/09

Crowdsourcing – The Live Experiment

After Greg's post on crowdsourcing, he and I met to explore how we might push this envelope a little further. We decided to run an experiment to see how well crowdsourcing might work for a firm. We wanted to show immediate value and cost savings, and to demonstrate which types of tasks might be handled this way.

The Setup

Tool: We (mostly Greg) decided to use Amazon's Mechanical Turk (MTurk) as the tool. After some research on price ranges, we chose to pay our workers 10 cents per task. That appears to be an average price based on what we saw, and it's cheap ... like Greg.

Project: We know law firms love to have information about General Counsels (GCs), so we selected that as our task. Greg pulled a list of companies from two different markets: 50 from the Midwest and 50 from the SF Bay Area. These companies are listed on the site for the project. We then asked for the following for each company: the GC's first name, last name, and a link to their online bio. The bio had to come from the company's own site. We let the workers know the GC might have another title, such as Chief Legal Officer or Corporate Secretary.

The Method: To test for and ensure quality we did two things. First, Greg ran a report from another system, so we already knew the answers to our questions. Second, we allowed two people to respond for each company's GC. This "double-blind" approach would serve as a quality check on the information we obtained. Our budget for the experiment: $22.00 ($20.00 for the task payments and $2.00 for the MTurk fee).

Initial Response: Within the first hour we had 20% of our responses in, at a cost of $4.24 per hour. The quality seems high, but a full analysis will come once we close the project. Obviously the crowdsourcing approach will have limitations as to the types of tasks and information we can collect, but our initial assessment is that this idea has merit. It appears to have hit Greg's trifecta: Cheap, Easy and Fast. More to come ...
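For anyone curious what setting up a batch like this looks like in code, here is a minimal sketch of posting one HIT per company through the MTurk API. It is illustrative only: the experiment itself used the 2009-era requester tools, the boto3 library shown here is a later Python client, and the company names, question wording, and timing values are placeholders. The 10-cent reward and the two-assignments-per-company setting mirror the setup described above.

```python
# Minimal sketch: one 10-cent HIT per company, two workers allowed per HIT.
# boto3 and all specific values here are illustrative, not what we actually ran.
import boto3

mturk = boto3.client("mturk", region_name="us-east-1")

def build_question(company_name):
    # QuestionForm XML asking for the GC's first name, last name, and bio URL.
    return f"""<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
      <Question>
        <QuestionIdentifier>gc_info</QuestionIdentifier>
        <QuestionContent>
          <Text>Find the General Counsel (may be titled Chief Legal Officer or
          Corporate Secretary) of {company_name}. Provide the first name, last
          name, and a link to their bio on the company's own web site.</Text>
        </QuestionContent>
        <AnswerSpecification>
          <FreeTextAnswer/>
        </AnswerSpecification>
      </Question>
    </QuestionForm>"""

companies = ["Example Corp", "Sample Industries"]  # the real list had 100 names

for company in companies:
    mturk.create_hit(
        Title=f"Find the General Counsel of {company}",
        Description="Name and bio link for the company's top in-house lawyer.",
        Keywords="research, legal, general counsel",
        Reward="0.10",                      # 10 cents per task
        MaxAssignments=2,                   # two workers per company, as a quality check
        LifetimeInSeconds=3 * 24 * 3600,    # keep the HIT open for three days
        AssignmentDurationInSeconds=15 * 60,
        Question=build_question(company),
    )
```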
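And a sketch of the quality check itself: with two worker answers per company plus the known answer from Greg's report, agreement can be tallied with a few lines of comparison logic. The data layout and names below are hypothetical.

```python
# Sketch of the quality check: compare the two worker answers for each company
# to each other and to the known answer from the other system.
# (Data layout and field names are hypothetical.)

known = {
    "Example Corp": ("Jane", "Doe"),                      # from the report we already had
}

worker_answers = {
    "Example Corp": [("Jane", "Doe"), ("Jane", "Doe")],   # two workers per company
}

def normalize(name_pair):
    return tuple(part.strip().lower() for part in name_pair)

for company, answers in worker_answers.items():
    answers = [normalize(a) for a in answers]
    workers_agree = len(set(answers)) == 1
    matches_known = normalize(known[company]) in answers
    print(f"{company}: workers agree={workers_agree}, matches known answer={matches_known}")
```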


4 comments:

Ron Friedmann said...

Great idea and experiment. Have you considered adding recognition to the monetary award? That would take advantage of competitive juices and mean a lower dollar budget.

Next step: ask for help on not-so-easy-to-find info? Maybe a good contact at a prospect, or a precedent / template for an active matter.

Related note... I've blogged about predictive markets, which are another form of crowdsourcing. Views on that?

Greg Lambert said...

Ron,

It is obvious that this is not your first crowdsourcing interaction. Those are some great ideas.

With MTurk, you don't know who the people are that are picking up the "HITs", so I'm not sure if you can do any type of recognition of the MTurk workers. I'll check on that, though.

Anonymity works both ways, too. They don't know who I am either. That can be a good or bad thing depending upon how you want to interact with the crowd.

The question we gave looks pretty simple on the surface, but in reality probably about 20% of the questions we asked really didn't have an answer the worker could point to. (More on that in tomorrow's blog.)

I've been really happy with all of the information that we are finding during this test, and there are a lot of things that we didn't think about when we started this that are now becoming obvious to us. (Again, more on that later.)

John Craske said...

Great ideas - looking forward to seeing how this develops.

Legal process outsourcing said...

Thanks for this great and informative post...
Regards,

LPO

 
