Cash Transfer Equivalency Calculator

Closing my company has given me the time to pursue a number of small projects. One of those projects is a concept I wrote about last month called the Cash Transfer Equivalency (CTE). The CTE is a simple investment standard that a program officer or social investor can use to assess whether a social program might deliver more value than simply giving equal amounts of cash away.

To make the CTE easier to use, I wrote a web-based CTE calculator that allows users to enter a program’s cost, the number of people the program intends to serve, and the estimated value of that service to each of the intended beneficiaries. Based on those inputs, the CTE calculator estimates whether the proposed social intervention will provide more value than simply giving money away.

The CTE calculator is an easy-to-use initial assessment tool for grantmaking institutions evaluating new grant opportunities. Importantly, because the CTE translates social value into monetary terms, one could use the CTE calculator to compare two or more unlike funding opportunities.


There isn’t much to the CTE calculator, so if you are so inclined you can skip this quick tutorial and give it a try now. But for clarity’s sake, let’s run through the following example.

Let’s say we are approached to fund a youth-focused musical enrichment event. The potential grantee is requesting $7,500 to hold a one-day concert for low-income kids. Our first step in the CTE calculator is to enter the program cost, in this case $7,500.


The youth concert expects 200 kids to attend. In step two, we enter the expected number of people affected by the program as 200.


You’ll notice that in step two we didn’t just enter the number of people; we also need to fill in the “average value” column. The “average value” is our best guess as to how much each kid would have been willing to pay to attend the concert, were the program not being provided free of charge. In this case, we put in an estimate of $35 per person.

With those three simple inputs, the calculator computes the CTE and suggests whether the program is worth investing in.


With our youth concert example, the calculator computes a CTE of 0.93. Because the CTE is below 1 (the point of indifference between doing the program and giving away equal amounts of cash), the calculator determines that the program is not worth investing in.

More simply, the basic mechanics of the CTE come down to average cost versus expected average value. At a program cost of $7,500 with 200 concert attendees, the average cost per person is $37.50. However, the expected average value we entered was just $35 per youth. The program therefore costs more per person, on average, than the value we expect each youth to receive.
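The arithmetic above can be sketched in a few lines of Python. The function below is my own illustration of the calculator’s logic; the formula (expected average value divided by average cost) is inferred from the worked example, not taken from the calculator itself.

```python
def cte(program_cost: float, people: int, avg_value: float) -> float:
    """Cash Transfer Equivalency: expected value delivered per dollar spent.

    A CTE of 1 is the point of indifference between running the program
    and simply giving the same money away as cash.
    """
    avg_cost = program_cost / people  # cost per person served
    return avg_value / avg_cost       # value delivered per dollar of cost

# Youth concert example: $7,500 to serve 200 kids, valued at $35 each
score = cte(7500, 200, 35)
print(round(score, 2))  # 0.93 -- below 1, so cash would do more good
```

A CTE above 1 would flip the recommendation: if the same concert cost $7,000, the average cost per kid would be exactly $35 and the CTE exactly 1.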

This is a pretty straightforward example. Where the CTE calculator gets more interesting is when a program targets more than one recipient group. Using the youth concert example, you could imagine not just calculating the return to the kids, but perhaps their parents as well. The calculator allows you to enter any number of target groups, calculating the CTE for each group as well as a weighted average CTE across groups.
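A multi-group version can be sketched the same way. The parent group, its numbers, and the size-weighted averaging scheme below are my own assumptions for illustration; the calculator may weight groups differently.

```python
# Hypothetical extension of the concert example: the event reaches the
# kids directly and, less directly, their parents.
# Each group: (name, people reached, average value per person)
groups = [
    ("kids",    200, 35.0),
    ("parents", 100, 10.0),
]
program_cost = 7500.0

total_people = sum(n for _, n, _ in groups)
avg_cost = program_cost / total_people  # cost spread across everyone served

# Per-group CTE: each group's average value against the shared average cost
for name, n, value in groups:
    print(f"{name}: {value / avg_cost:.2f}")

# Size-weighted average CTE; algebraically this reduces to total expected
# value divided by total program cost.
weighted = sum(n * v for _, n, v in groups) / program_cost
print(f"weighted: {weighted:.2f}")
```

In this hypothetical, counting the parents pushes the weighted CTE just above 1, which is exactly the kind of reframing a multi-group analysis makes visible.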

Using the CTE calculator

I wish I had had the CTE calculator when I was working at a financial intermediary making grants to community development corporations in Pittsburgh. The calculator would have allowed me to weed out bad investments more quickly, and more importantly would have provided a much-needed standard method for preliminarily assessing the high volume of incoming grant requests.

If I were still working as a grantmaker, I would make the CTE a part of our initial grant application assessment. Each application would be assigned a CTE score by an individual program officer. The grants with the highest CTE scores would then go to committee for further consideration.

Because the CTE score hinges on the assumed monetary value per program recipient, the investment committee would likely debate the value assumption in the model. This is a good thing and illustrates the CTE method’s strength. Because the CTE score is driven as much by our best guess of the monetary value to the beneficiaries as it is by cost, the CTE forces investment committees to have frank discussions about the value they believe their grantmaking will create.

You can check out the CTE calculator here and use it as you’d like.

Nonprofit consultants, beware of window shoppers

I’m no fan of nonprofit consultants, despite being one myself. But nonprofit consultants are people too, although we’re not always treated as such by the organizations we serve.

As knowledge workers, what we know is what we sell. Yet the courting process for securing work (multiple meetings, requests for proposals, etc.) requires that we disclose methodologies to potential customers.

I get that outlining approaches to potential customers is a necessary part of the process. It allows both parties to determine whether the consultant is a good fit. But every consultant has stories of laying out a methodology and entertaining a number of questions from excited-sounding staff and board members, only to have those same ideas implemented by another vendor or the organization’s staff.

No hire, no attribution, nothing.

This is a pretty messed up approach, and if you are a nonprofit consulting looky-loo (you know who you are), please stop.

I’m not perfect at spotting nonprofit consulting window shoppers, but with some experience under my belt I’ve certainly gotten better at avoiding these organizations. Here are a few tips to avoid being a victim of thought theft.

  1. Qualify customers – Before filling out a request for proposal (RFP) or agreeing to meetings, look up an organization’s 990 on Guidestar and check out their annual revenue. If revenue is tight and the proposed scope looks to be outside their budget, you might have a window shopper on your hands.
  2. Be wary of unsolicited requests for proposals – Organizations are typically required to get more than one bid for a project, even if they have a preferred vendor in mind. I’ve certainly had some luck with organizations sending me RFPs out of the blue, but I’m generally wary of these “opportunities”, as they tend to be fishing expeditions for pre-selected vendors.
  3. Be judicious with your time – Window shoppers have a nasty habit of setting up multiple meetings, wasting your time while sucking you dry of your hard-fought good ideas. Value your time. If you don’t, they won’t. And if a nonprofit is asking for too much face time without any commitment, it might be time to walk.
  4. Ask around – Ask other consultants about nonprofits you are thinking of working with. I’ve avoided some bad contracts by tapping my network.

My tendency, like that of other (good) nonprofit consultants, is to be helpful. I love geeking out on all things social sector. While the nonprofit sector is accustomed to receiving pro-bono help, manipulating nonprofit consultants looking for work into offering up their ideas for nothing is contrary to the principles of our do-gooding industry.

How social proof subjugates program evaluation

About a year and a half ago, The Verge wrote an incredible exposé on the seedy underworld of get-rich-quick fake business gurus, who prey on hapless victims down on their luck and in need of cash.

The basic scam is to sell a wide range of “products” to people aspiring to start one-person businesses. Each of these products is basically a PDF document full of shallow advice that recommends further products in the series to achieve success.

To the discerning eye, it’s not terribly difficult to spot business self-help website nonsense. They all basically look like some derivation of this.

At the heart of the online marketing underworld is the concept of “social proof”. These business guru scammers collude to make it appear as though they are experts in business, and insanely wealthy. They do so by linking to each others’ websites to manipulate search engine rankings, and quoting one another on their respective websites, giving the illusion that each of these individuals is endorsed by other experts.

I’ve been sitting on this topic for quite some time, always thinking back to the concept of social proof when any new social sector “breakthrough” initiative is touted loudly in the media without a shred of evidence that said intervention actually works.

Who needs evaluation when you have publicity?

Good stories trump good data in the media, and questionable ideas that sound plausible are shrouded in social proof and promoted as though they were ideas worth spreading.

For those of us in the social sector, it’s (generally) easy to spot initiatives with exaggerated claims of success. But for casual donors with untrained eyes, the source of enormous amounts of support for philanthropic causes, the difference between real outcomes and social proof can be elusive.

I’m not sure how one might go about tackling this problem. There are plenty of nonprofits that try to be honest about their results for internal improvement, and to a lesser extent are transparent with their donors about their findings.

But the incentive is always there to manufacture positive publicity by promoting misleading claims of impact, and more importantly, to get other nonprofits, coalitions, businesses, politicians, and media outlets to repeat those claims, thus creating truth.

The social proof versus program evaluation conundrum is a non-trivial puzzle. Donor education programs are more likely to appeal to savvy donors in the first place, so donor education is at least a difficult path, if not a non-starter.

For nonprofits, favoring actual proof over social proof is a poison pill, as high-flying headlines and endorsements by public figures in major publications will always trump more down-to-earth claims of impact.

It’s an interesting question without a clear answer. The cost of not figuring it out is donor capital flowing to compelling-sounding claims rather than actual results.