Earlier in the year we began the search for a church management software vendor. I’ve been meaning to share the process we went through, as well as our results, but haven’t had the chance to till now. We’re in the midst of a large software project, replacing all our core business software with more suitable applications. One of the needs we recognised early on was for our pastors and ministry staff to have a tool that was truly well suited to their particular needs. The process we went through is equally applicable to most kinds of software, not just ChMS.
We’re using The Raiser’s Edge for all of our larger constituent management, but we felt we owed it to the pastors to get them a best-in-breed tool, which we’d then integrate as necessary with Raiser’s.
Update: A sanitized version of our Needs Assessment is included here. It will NOT match your needs exactly, but feel free to use it as a template. Please don’t distribute it online yourself.
I spent some time looking online, and a number of blogs were hugely helpful in our early research phase, especially that of Joel Lingenfelter. After a few weeks, we settled on the following products to examine:
- Fellowship1
- Church Community Builder
- Shelby Arena
- ThinkMinistry MinistryPlatform
- ConnectionPower
There are maybe 70 vendors in this space, but the above 5 felt the most robust and headed in the direction we were going. We ruled out ACS as we found them pretty much irrelevant for a modern, web-powered ministry such as ours. We also ruled out TheCity, as it didn’t appear to have the right focus for us.
Defining Our Needs
The most important factor in choosing the right product is knowing what your needs are. We’d made this mistake in the past, so I was determined not to do it again. Over a period of a couple of months, my team and I spent time with many of our pastors, and one of them in particular. We noted down all of the things they felt that software could help them accomplish, and together we fleshed these out into detailed use cases and requirement lists. We passed these back to the pastors for review, and eventually ended up with 30 pages of “Needs Assessment”, clearly defining all of the requirements that we had from a ChMS product. Each of the requirement sections and subsections had priorities, and lists of actors that required this functionality.
For anyone else considering this kind of project, I cannot stress enough the importance of taking your time during this phase and getting it done right. Resist the temptation to look at the cool features of the pretty software. Unless your pastoral people really need their brains jogged to understand the possibilities, any time you spend looking at software will be time you don’t spend defining your needs. Trust me.
We started this process in January 2010, at which time the products I listed above existed as the list portrays. Fast forward to today, and there’s been considerable change to that list, which I think is important to convey before we go any further.
Fellowship1 was acquired by Active Network. Active had been trying to buy F1 for 3 years, we’re told, and eventually the management at F1 felt that Active were heading in a complementary direction, and could help them reach their goals faster. We generally didn’t take this into consideration too much in our evaluation, as we’ve seen acquisitions go both ways. A few months later, we got word that TheCity had been picked up by ACS, which was a fascinating turn-around, given TheCity’s very web-ie nature, and ACS’s distinct lack thereof. ACS had also picked up NSpire a few months prior, which is one of the products we’re migrating away from. I can’t imagine ACS will have an easy time managing all of their new customers and codebases… but I wish them luck.
The next acquisition was a major surprise to us. Given the number of vendors making ChMS software, and the fact that there were at least five robust, viable products on the market with very similar features, acquisitions and mergers were likely to take place. This is why we weren’t particularly surprised about the F1 acquisition. I also spent a few days with Blackbaud in Charleston, and they indicated they were quite interested in the ChMS space as well, whether by development or acquisition. What took us by surprise was when Active Network went ahead and bought out a second, top-tier ChMS product: ConnectionPower.
For our particular needs, we found ConnectionPower to be the weakest of the products that we looked at, but it was a rich product nonetheless, with a strong customer base. In a period of 6 months, Active Network grabbed themselves around 3300 customers: major market share in this space. They’ve announced that they’ll discontinue the ConnectionPower product, and roll its unique features into Fellowship1 (which is being re-branded at some point). I’m skipping over anything else about ConnectionPower in this post, as the product is irrelevant now.
The Evaluation Process
I spent a few weeks getting to know the vendors, discussing their ideals and goals with their sales folks, and reading as much as I could find about each solution online (from the vendors, and more importantly NOT from vendors). We also reviewed pricing, technical requirements, corporate profile, customer feedback and various other things not relating to the actual usability or features of the solution. These last ones had significant bearing on our choice of solution, especially the presence of Canadian customers. We’ve learned to make this an important point for us, as have most Canadian entities looking at the US market for software; and it severely hurt the chances of a few products, specifically Shelby Arena and ThinkMinistry MinistryPlatform. At the time we evaluated, Church Community Builder also couldn’t handle financial transactions outside of the United States, but more on that later.
After that initial process, I used the Needs Assessment that I explained above, to build a scoresheet of functionality that I thought we could cover in a demonstration. We then scheduled demos with each of the vendors, supplied them with our scoresheet and full Needs Assessment, and assembled a team to evaluate the solutions. That team consisted of me, two of my technical staff (my DB guy and my training guy) as well as 2-4 pastors, depending on the day of the demo. The demos were all at least 3 hours long, some of them closer to 4 hours. During the demonstrations, I had each of the pastors with their own copy of the scorecard, marking down grades on the functionality as it was shown to them. Once the demos were done, then we compared the grades and discussed the solution at length. We settled on grades for each piece of functionality, and then reviewed the overall score and compared it to how we felt about the product in general. We awarded generous bonus points for things that smashed it out of the park.
Once we’d finished the demos (which took the better part of 3 weeks) we started comparing the solutions to one another to try and normalize our scores. We did have to go back to a couple of products and see them again, as we inevitably missed things, or didn’t realise something we should have asked beforehand. Once we’d massaged each solution’s numbers to a point we felt was fair, then I built some cunning formulas.
After we’d spent 15+ hours looking at software, it was clear that ANY of them could get the job done for us. What wasn’t clear was how each product performed overall against our specific priorities; there were so many trees we couldn’t hope to see (or even remember) the forest. I decided to take a fairly mathematical approach, the description of which you can skip over if you want. I’ll be including some PDFs and Excel sheets shortly, so you can reproduce some of this yourself, in case my describing this to you makes your eyes glaze.
Since the pastors had given us priorities for each of their requirements, we could extrapolate these out to point pools, which when combined with the scores, would produce weighted averages. Put it this way:
Priority 1 = 100 points
Priority 2 = 75 points
Priority 3 = 50 points
If “Functionality A” received a 7/10, that’s 70%, and if “Functionality A” was Priority 2 to us, then it got itself 52.5 points.
Line up the points awarded beside the max points possible, run a weighted average calculation on those columns (technically a sumproduct divided by a sum), and you’ll come up with a score referencing your priorities. We scored each individual piece of functionality this way, and then rolled those values up to subsections of functionality, which were globally prioritized and scored again. The ultimate result was a score out of a 100, for how well this product performed on the things that were most important to us. We also totalled the non-prioritized raw scores, to provide another evaluative point.
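The maths above can be sketched in a few lines. The priority point pools and the 7/10-at-Priority-2 arithmetic come straight from the description; the feature names and grades below are made-up placeholders, not our actual Needs Assessment data:

```python
# Priority point pools, as described in the post.
PRIORITY_POINTS = {1: 100, 2: 75, 3: 50}

# (feature, priority, grade out of 10) -- placeholder examples only.
features = [
    ("Member profiles",      1, 9),
    ("Volunteer scheduling", 2, 7),
    ("Mail-merge letters",   3, 6),
]

def weighted_score(features):
    """Weighted average: sum of (grade% * point pool) divided by the
    maximum points possible -- a SUMPRODUCT over a SUM, as a percentage."""
    earned = sum((grade / 10) * PRIORITY_POINTS[prio]
                 for _, prio, grade in features)
    possible = sum(PRIORITY_POINTS[prio] for _, prio, _ in features)
    return 100 * earned / possible

def raw_score(features):
    """Unweighted average of the raw grades, as a percentage."""
    return 100 * sum(grade for _, _, grade in features) / (10 * len(features))
```

Run against these placeholder grades, the weighted score comes out higher than the raw score, because the Priority 1 item graded well; that gap is exactly what the weighting is there to surface.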
We charted all of that, so you could see the numbers of the competing products clearly. We then built out additional charts with other working-sets of functionality, such as: Young Adult Campus tasks vs Connecting a new visitor. We didn’t prioritise these ones, just looked at the vendors’ scores for each set. We also examined consistently highest scores, as well as consistently lowest scores, to see what further trends might emerge. As you’ll see in a moment, the scores were so close we really felt we had to try to push them harder to find weak spots.
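That consistently-highest/consistently-lowest check can be sketched like so. The vendor names are real, but the grades here are invented placeholders, not our actual scores:

```python
from collections import Counter

# Placeholder grades per piece of functionality, per vendor.
grades = {
    "Small group tools": {"F1": 8, "CCB": 9, "Arena": 6, "MP": 7},
    "Check-in":          {"F1": 9, "CCB": 8, "Arena": 7, "MP": 8},
    "Reporting":         {"F1": 9, "CCB": 7, "Arena": 8, "MP": 8},
}

def extremes(grades):
    """For each piece of functionality, note who scored highest and lowest."""
    return {feature: (max(by_vendor, key=by_vendor.get),
                      min(by_vendor, key=by_vendor.get))
            for feature, by_vendor in grades.items()}

def trend(grades):
    """Count how often each vendor lands at the top or the bottom."""
    tops = Counter(hi for hi, _ in extremes(grades).values())
    bottoms = Counter(lo for _, lo in extremes(grades).values())
    return tops, bottoms
```

A vendor that keeps turning up in the "bottoms" counter is exactly the kind of weak spot that gets lost in an overall average.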
Fellowship1 vs Church Community Builder vs Shelby Arena vs MinistryPlatform
The results that we came up with for each product are totally specific to our needs, and part of me is hesitant to post our grades and results online because of this. I will say this once again: the most important thing you can do, if you’re going through this process yourself, is to define your needs. Exhaustively. Because you’re going to spend good money, and a good amount of time implementing this software, and then you’re going to be married to it. We evaluated these products against OUR needs, and so should you. With that in mind, here’s a brief overview of each one.
Fellowship1
Beautiful, robust & powerful, F1 is the Cadillac Escalade of ChMS software. Its customer base is the largest, and it’s well deserved. It excels especially in member management and reporting.
Unweighted, raw totals score: 84.52%
Weighted, prioritized (smart) score: 87.72%
Church Community Builder
Friendly, powerful & organic, CCB is the Lincoln Navigator of ChMS software. It’s been around a bit longer and isn’t quite as pretty as F1, though it makes up for this in every way (and is by no means ugly, in its own right). Their multi-site/multi-campus capabilities are especially well thought out, and their communications tools are feature-rich.
Unweighted, raw totals score: 87.58%
Weighted, prioritized (smart) score: 87.81%
Shelby Arena
Shelby purchased the Arena software from a church who’d developed it themselves, and are now focussing all their development on it. It’s a robust, well-featured product that was held back for us by an unclear interface and a strong “Microsoft feel”. Their communications and reporting tools were very strong.
Unweighted, raw totals score: 81.35%
Weighted, prioritized (smart) score: 76.45%
Think Ministry MinistryPlatform
The new kid on the block, these guys have built out comparable features to all the others, in a fraction of the time. We liked a lot about it, but ultimately it’s aimed more at administrative staff than pastors, and for us that was the wrong focus. It’s got enterprise constituent management written all over it, and has the most flexible family/relationships & multi-congregation concepts we’ve seen yet (makes multi-site look one-dimensional).
Unweighted, raw totals score: 81.39%
Weighted, prioritized (smart) score: 79.70%
Needs Assessment - Catch the Fire – April 2011 (Word Doc, generally sanitized) I’m requesting that you don’t distribute this document online yourself. It was for OUR needs, and will need significant changes to match YOUR needs, but it may be a suitable template for you.
ChMS Solutions Rating – Catch the Fire – September 2011 (Excel Workbook, somewhat sanitized)
ChMS Solutions Rating – Catch the Fire – September 2011 (PDF printout of the above)
Down to the Wire
As I said earlier, any of these solutions could have worked for us in the end. The lowest score was 76%, which is hardly bad. From the scenario charts and the math, we could see that we were really looking at Fellowship1 and Church Community Builder. The other solutions just weren’t quite playing the way we wanted to play. Now you’ll notice that the weighted scores of F1 and CCB are INSANELY close. When I showed these scores to both companies, I think they were each a little disturbed how closely they had scored to one another (though on different functionality, some of the time). Given how close their scores were, and how much our various staff were enamoured with either solution, I jumped on a plane and spent a day in Denver followed by a day in Colorado Springs.
All things being equal, which they very nearly were, we had to make a decision about which product would fit our culture the best. I spent 6 hours or so each with the respective staffs of F1 and CCB, met people in roughly equivalent roles, and chewed the fat as much as I could. I told some jokes, I asked hard questions, went for a drive with my account managers, and did whatever I could to find out who these guys (and girls) were. If I could have assigned a numerical score to each one, it would have looked very similar to the scores they both got above. At the end of the day, both companies are run by awesome teams, with great vision for helping the Body of Christ. For the record, I have no reason to believe the other 2 solutions are run by any less awesome people, but I didn’t meet them myself.
In the end we chose CCB. Their smaller size felt comfortable to us, and in the time we spent together we felt that we had a closer DNA match. The lack of financial support outside the US was not an issue for us, as we have to take all our payments through The Raiser’s Edge, as it is the primary donations software that we’re using. CCB has many Canadian customers, each of which have either found a way around this, or didn’t need the functionality. CCB have a plan in place to address this, and while they didn’t commit to time frames I imagine it won’t be an issue this time next year. We signed contracts with them in August and are racing towards an October/November launch window.
If any of this has been of value to you, I’d love to hear about it in the comments. We put hundreds of man-hours of work into this, because we strive to be good stewards. It’s all for God’s glory, after all.