My experience trying to outsource myself



Like everyone else, I have practical goals, like paying the rent. So I registered my professional self, Dr. Claudia Krenz, at guru.com as ID #467830 (datafriend @ gmail-.-com).
I knew I needed to stay organized to make system guru work for me--my goal being to make money, not to submit proposals for the joy of writing--and, being the trained methodologist I am, I knew that meant collecting data (keeping my browser windows organized was simple, because iCab is my browser).

I posted a small project on 08/09/2004 (one line of PERL).

I stated that my budget was a pathetic $10. I am sensitive to the hostile outsourcing climate some programmers are now facing, because I have a Ph.D. and am part of "the glut" (we're a phenomenon of the western world, called by some "the quiet crisis"). Freelancing is the only option. I'm sure the programmers have their own networks: in the 1990s the "Young Scientists Network" took on luminaries like Carl Sagan for saying the country needed more scientists (given that the ones it had couldn't find jobs).

Not surprisingly, one of the first things I did after posting the project was to retract most of what I'd said; I posted on system guru's semi-public qb board:

To those who commented: yes, a) it appeared a *simple* task--but I've forgotten how to do simple tasks like that. b) I didn't think of XML, because I thought it was just for creating "data barns" ("data silos" not being large enough to hold all the constants being collected in the 21st century); also, this is just for personal use, something I want to do to check myself on my own HDD. c) I had been thinking of a single line--but a few lines of commands stored in a file work just as well. d) I'll use the quant stuff to compute new variables. e) Although I hadn't mentioned it, yes, of course, I will leave positive feedback for whoever answers my question.

I closed it early because of fatal errors:

System guru began experiencing "ISP problems"--so said its email--about 3 days after I posted the original project. I closed it a day early without awarding anyone: no system is perfect, so I shrugged my shoulders and reposted the project. System guru thanked me for helping them debug their system (I'm always glad to help out but was astonished, since I'm not a programmer).

I reposted the project as "one line of PERL redux" immediately afterwards, as a by-invitation-only project, with all those who had submitted bids to the original post as the invitees. Both were set up as 7-day posts. Both went through system guru's in-house review, whatever that means.

Just before closing the first project I discovered a fatal error of my own: I had been using my raw HTML file (gurusubmits.html) as input--and its special characters were, I'm sure, what had made PERL puke. I apologized for my contribution to the chaos (also mentioning that I thought it would have been helpful if I'd pasted the submitted code into the private db pop-ups as different ones of us conversed). I wrote that I knew I must first create a new input file by select-all/copying *from* my browser window (after it had rendered gurusubmits.html). I call this input file "sonofgurusubmits."
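(An aside of my own, not part of the project: a similar rendered-text input file can also be made from the command line, assuming the lynx browser is installed--its -dump option writes a page out as plain text:)

cloud% lynx -dump gurusubmits.html > sonofgurusubmits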

project description

the winning proposal


Bid for project: $10.
Rate for project: $125 per hour.
Proposal for project: Okay, everybody seems to be having trouble with special characters, so let us solve this now:
perl -e 'eval pack("H*", "756e64656620242f3b666f726561636820283c3e203d7e206d2f5e287469746c653a2e2a3f2946524f4d3a2e2a3f2f67736d297b732f3b2f5c6e2f673b732f285c732a544f3a292f5c6e24312f673b7072696e742022245f5c6e5c6e223b7d")'   <sonofgurusubmits> out.txt

Naturally, my heart thrilled at "let us solve this now" (when using the system guru private db system, everything between < and > was stripped out, which made it impossible to quote the submitted code--part of my humble effort to stay organized and efficient). A workaround some used was to "attach" a file containing the code constituting their proposal; that solved the problem of stripped characters but created one of attribution (perhaps adding a #comment with the bidder's name would have helped). There were several submissions that worked--amazing what PERL can do if you don't make it puke--and I'll include them when I update this page. For lack of a better decision rule, I awarded the project to the first person submitting code that worked on my HDD.
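For the curious, the long hex string isn't really magic: eval pack("H*", ...) simply turns the hex back into ordinary PERL text and runs it, which is presumably how it slips the special characters past whatever strips them. Unpacking it on my own HDD--so treat this as my decoding, with whitespace added for readability, not as the bidder's canonical code--gives roughly:

cloud% perl -e 'undef $/; foreach (<> =~ m/^(title:.*?)FROM:.*?/gsm) { s/;/\n/g; s/(\s*TO:)/\n$1/g; print "$_\n\n"; }' < sonofgurusubmits > out.txt

In plain words: slurp the whole file, grab each block running from a line that starts with "title:" up to the next "FROM:", turn the semicolons into line breaks, put each TO: on its own line, and print every block followed by a blank line.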

The submitted code--although I might as well be throwing pixie dust over my shoulder for all I understand of it--gives me exactly what I wanted: a means of tracking my work, exactly like that shown in the 2nd column of the table above. It is now very easy for me to track all kinds of data.

completing the project



post-project fun with real-time data

Thanks to this project, I can now obtain real-time data reflecting my actual use of system guru. On 08/28/04, I applied the pixie dust--after, of course, creating the kind of input file described above (I named it sunofgurusubmits828).
cloud% perl -e 'eval pack("H*", "756e64656620242f3b666f726561636820283c3e203d7e206d2f5e287469746c653a2e2a3f2946524f4d3a2e2a3f2f67736d297b732f3b2f5c6e2f673b732f285c732a544f3a292f5c6e24312f673b7072696e742022245f5c6e5c6e223b7d")' <sunofgurusubmits828 > out828.txt
cloud% grep -c DATE: sunofgurusubmits828
83
cloud% grep -c DATE: out828.txt
83

What does this tell me? That I've submitted proposals to 83 projects (no wonder I don't have time for TV--I'm consoled by Lord Kelvin's words, roughly: if you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind) .... Contrast what's now on my HDD with the many uninformative constants being collected online these days--at great cost to us all and of no benefit to anyone. Instead of useless constants, *I* have *real* variables, lovely data like # of projects applied for and # of those that have since closed: gorgeous interval-level data (none of that crummy so-so science stuff, ordinal or worse, nominal categorical data--quasi-variables like SES). While lacking the general relevance of the U.S. Geological Survey's Mount St. Helens seismic measures, my data are every bit as good as USGS's (!!): both are equal-interval scales of measurement with--get this--meaningful, interpretable 0s, and so ratio data, the best of the best--the only difference being that the USGS doesn't call anything "pixie dust." Heady thoughts indeed.

What I next want to examine is whether there's a difference-- in closing projects w/o awarding anyone--between projects posted by those with a rating and those without. My data cannot, of course, address reasons why projects close.

cloud% grep Q out828.txt | awk '{print NR, $0}' > QYQN

cloud% grep Q -c out828.txt
42

cloud% grep Q -c QYQN
42

About half of the projects to which I've applied--42 to be exact--have closed (at this particular point in time). I want to find out more about projects closing without awardees.

cloud% grep QY -c QYQN #projects closed by employers w/ a rating
7

cloud% grep QN -c QYQN #projects closed by employers w/o a rating
35

35 + 7 = 42, the total number of closed projects.

Here are the details about my closed projects, first those posted by employers with a rating and then those without.
cloud% grep QY QYQN
1  QYwhen closed: 0awards, 15applicants
4  QYwhen closed: 1awards, 5applicants
6  QYwhen closed: 0awards, 33applicants
7  QYwhen closed: 1awards, 18applicants
8  QYwhen closed: 1awards, 14applicants
10  QYwhen closed: 0awards, 8applicants
13  QYwhen closed: 0awards, 9applicants
cloud% grep QY QYQN | grep 0awards -c
4
cloud% grep QN QYQN
2  QNwhen closed: 0awards, 19applicants
3  QNwhen closed: 0awards, 24applicants
5  QNwhen closed: 1awards, 42applicants
9  QNwhen closed: 1awards, 30applicants
11  QNwhen closed: 0awards, 11applicants
12  QNwhen closed: 0awards, 8applicants
14  QNwhen closed: 0awards, 12applicants
15  QNwhen closed: 0awards, 11applicants
16  QNwhen closed: 1awards, 10applicants
17  QNwhen closed: 0awards, 10applicants
18  QNwhen closed: 0awards, 22applicants
19  QNwhen closed: 0awards, 1applicants
20  QNwhen closed: 0awards, 5applicants
21  QNwhen closed: 0awards, 12applicants
22  QNwhen closed: 0awards, 37applicants
23  QNwhen closed: 0awards, 21applicants
24  QNwhen closed: 0awards, 17applicants
25  QNwhen closed: 0awards, 1applicants
26  QNwhen closed: 1awards, 22applicants
27  QNwhen closed: 1awards, 22applicants
28  QNwhen closed: 0awards, 9applicants
29  QNwhen closed: 0awards, 3applicants
30  QNwhen closed: 0awards, 18applicants
31  QNwhen closed: 0awards, 11applicants
32  QNwhen closed: 0awards, 24applicants
33  QNwhen closed: 0awards, 18applicants
34  QNwhen closed: 0awards,   4applicants
35  QNwhen closed: 0awards, 12applicants
36  QNwhen closed: 1awards, 2applicants       
37  QNwhen closed: 0awards, 49applicants
38  QNwhen closed: 0awards, 21applicants
39  QNwhen closed: 0awards, 10applicants
40  QNwhen closed: 1awards , 5applicants
41  QNwhen closed: 0awards, 8applicants
42  QNwhen closed: 0awards, 21applicants
cloud% grep QN QYQN | grep 0awards -c
28

I cross-tabulate these data below, with columns indicating whether the project had been posted by an employer w/ a rating (or not) and rows indicating whether the project had been awarded to anyone (or not). First I list what I did to fill in each cell (the ones obtained by addition or subtraction are self-evident), then the resulting table; a one-pass alternative to all those greps is sketched after the table.
How each cell was obtained:

  No / Rated employers:       grep QY QYQN | grep 0awards -c
  No / Unrated employers:     grep QN QYQN | grep 0awards -c
  Total / Rated employers:    grep QY -c QYQN
  Total / Unrated employers:  grep QN -c QYQN
  Yes row:                    derived by subtraction

The resulting table:

  Project awarded to       Rated        Unrated
  anyone when closed?      employers    employers    Total
    Yes                        3            7          10
    No                         4           28          32
    Total                      7           35          42
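(For what it's worth, the whole cross-tab can be had in one pass instead of four greps. This is just my own sketch--the file name crosstab.awk is made up--run directly against out828.txt:)

cloud% cat crosstab.awk
# tally closed projects by (awarded?, employer rated?) from the QY/QN lines
/QY|QN/ {
    rated   = ($0 ~ /QY/)      ? "rated" : "unrated"
    awarded = ($0 ~ /0awards/) ? "no"    : "yes"
    n[awarded "," rated]++
    total[rated]++
}
END {
    print "yes:  ", n["yes,rated"]+0, n["yes,unrated"]+0
    print "no:   ", n["no,rated"]+0,  n["no,unrated"]+0
    print "total:", total["rated"]+0, total["unrated"]+0
}
cloud% awk -f crosstab.awk out828.txt

Running it should reproduce the counts in the table above.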

Of those 42 closed projects, looking at the row totals, 32--roughly 3/4--were closed without the project having been awarded to anyone. Looking at the column totals, 7 were closed by rated employers and 35 by employers who, like myself, are newcomers without a rating. The latter difference is not surprising, because most of the projects are posted by employers without a rating.

Looking now at the table cells: employers with a rating closed 4 of their 7 posted projects--a bit over half--without making an award, and those w/o a rating closed 28 of 35, or 80%, without awarding them to anyone.
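(A quick sanity check on those two proportions, using the same awk that's already on duty above:)

cloud% awk 'BEGIN { printf "rated: %.0f%%  unrated: %.0f%%\n", 100*4/7, 100*28/35 }'
rated: 57%  unrated: 80%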

Note that these data merely represent my use of system guru and so aren't generalizable: we'd need more data for that. The number of projects closed w/o awards would make me wary of relying on any single univariate statistic, e.g., measuring growth solely by increases in the "# of posted projects" (it doesn't cost anything to post a project). These usage data are also cross-sectional, a snapshot at a given point in time. In that context, they are instructive--I was very grateful for one line of PERL, delighted with my current data, and eagerly anticipated collecting more, regularly, which I did. Oh, yeah, one thing more: my data didn't help pay the rent.