You have an annual income target of $___. Each month you have time for at most ___ pitches, a number that shrinks while you're busy on a gig. The time investment for each pitch is between ___ and ___ hours, all in. The chance of getting a gig is about ___%, and a gig will pay at least $___, with a diminishing probability of paying considerably more. Gigs typically begin between ___ and ___ weeks hence, and will require your full attention for 40% (2 days) of the weeks following that. You typically get paid within ___ to ___ weeks after completion.
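If you want to see how these parameters interact, here's a minimal sketch of one simulated year, in Python. Every concrete value and name in it (ANNUAL_TARGET, WIN_RATE, and so on) is an illustrative placeholder of mine, not a default from the simulator itself:

```python
import random

# Illustrative placeholder parameters -- the essay lets you set these;
# none of these values come from the original text.
ANNUAL_TARGET   = 100_000   # $ income target for the year
PITCHES_MONTHLY = 4         # max pitches per month when free
PITCH_HOURS     = (2, 10)   # hours per pitch, all in
WIN_RATE        = 0.10      # chance a pitch becomes a gig
MIN_PAY         = 5_000     # $ floor; diminishing chance of much more
LEAD_WEEKS      = (2, 6)    # weeks until a won gig begins
GIG_WEEKS       = 4         # weeks a gig runs, at 2 days/week of attention
PAY_LAG_WEEKS   = (2, 8)    # weeks from completion to payment

def simulate_year(rng=random):
    busy = [False] * 104          # calendar of weeks occupied by gig work
    payments = []                 # (week the money arrives, dollars)
    gig_starts = []
    pitch_hours = 0.0
    for week in range(52):
        # Weekly pitch quota, halved while a gig has your attention.
        quota = PITCHES_MONTHLY / 4 * (0.5 if busy[week] else 1.0)
        pitches = int(quota) + (rng.random() < quota % 1)
        for _ in range(pitches):
            pitch_hours += rng.uniform(*PITCH_HOURS)
            if rng.random() < WIN_RATE:
                start = week + rng.randint(*LEAD_WEEKS)
                gig_starts.append(start)
                for w in range(start, start + GIG_WEEKS):
                    if w < len(busy):
                        busy[w] = True
                pay = MIN_PAY * rng.paretovariate(2.0)  # heavy upper tail
                paid_week = start + GIG_WEEKS + rng.randint(*PAY_LAG_WEEKS)
                payments.append((paid_week, pay))
    total = sum(p for _, p in payments)
    cash = sum(p for w, p in payments if w < 52)  # received inside the year
    return {
        "total_income": total,
        "cash_flow": cash,
        "receivables": total - cash,
        "pitch_hours": pitch_hours,
        "gig_days": 2 * GIG_WEEKS * len(gig_starts),
        "gigs": len(gig_starts),
        "gig_starts": sorted(gig_starts),
        "met_target": total >= ANNUAL_TARGET,
    }
```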

When you are ready to run simulations with these parameters, hit the button below.

You met your target X% of the time (with a median income of $M), from an average of Y gigs spaced an average of Z days apart. Here are some typical runs, and you can draw more from the pool of results:
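(If you're curious how those headline numbers fall out of a pool of runs, here's a sketch, assuming each run is a dict shaped like the simulation sketch above; the field names are mine:)

```python
from statistics import mean, median

def summarize(runs):
    """Headline stats for a pool of runs shaped like simulate_year's output."""
    hit_rate   = 100 * mean(r["met_target"] for r in runs)     # X%
    med_income = median(r["total_income"] for r in runs)       # $M
    avg_gigs   = mean(r["gigs"] for r in runs)                 # Y
    # Gaps, in days, between consecutive gig start weeks within each run.
    gaps = [7 * (b - a)
            for r in runs
            for a, b in zip(r["gig_starts"], r["gig_starts"][1:])]
    avg_gap = mean(gaps) if gaps else float("nan")             # Z
    return hit_rate, med_income, avg_gigs, avg_gap

# e.g.: X, M, Y, Z = summarize([simulate_year() for _ in range(10_000)])
```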

Here are some atypical runs, and you can draw more of those as well:

How did I determine typicality? I measured a bunch of features from the simulation runs, took the top two principal components, found the centroid, measured each point's distance from it, and declared the closest 80% to be typical.
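In code, that procedure might look like this sketch using numpy and scikit-learn; the two components and the 80% cutoff are as described above, while the function name and array shapes are my own:

```python
import numpy as np
from sklearn.decomposition import PCA

def split_typical(features, keep=0.80):
    """features: one row of measured features per run (n_runs x n_features).
    Returns a boolean mask marking the 'typical' runs."""
    # Project each run onto the top two principal components.
    pts = PCA(n_components=2).fit_transform(np.asarray(features))
    centroid = pts.mean(axis=0)        # ~origin, since PCA centers the data
    dist = np.linalg.norm(pts - centroid, axis=1)
    cutoff = np.quantile(dist, keep)   # the closest 80% count as typical
    return dist <= cutoff              # atypical runs are the complement
```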

Here are some histograms showing the distributions of interesting dimensions across the simulation runs (a plotting sketch follows the list):

Total income: all dollars earned from gigs pitched within the one-year period, whether or not they've been paid yet.
Cash flow: the dollars received during the one-year period.
Receivables: the dollars due after the one-year period has elapsed.
Hours spent pitching.
Days on the job.
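And here's a minimal matplotlib sketch of those histograms, again assuming runs shaped like the earlier sketches (the dict keys are my placeholders; the labels match the list above):

```python
import matplotlib.pyplot as plt

DIMENSIONS = [
    ("total_income", "Total income ($)"),
    ("cash_flow",    "Cash flow ($)"),
    ("receivables",  "Receivables ($)"),
    ("pitch_hours",  "Hours spent pitching"),
    ("gig_days",     "Days on the job"),
]

def plot_histograms(runs):
    fig, axes = plt.subplots(1, len(DIMENSIONS), figsize=(18, 3))
    for ax, (key, label) in zip(axes, DIMENSIONS):
        ax.hist([r[key] for r in runs], bins=30)
        ax.set_title(label)
    fig.tight_layout()
    plt.show()
```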