Why a How-We-Test page?
I think you should know how I test before you trust my picks. So this page lays it all out. No secret rubric. No tricks.
The 4 things I score
1. Price for what you get (25%)
Is the price clear on the site? Or do you have to call sales? How much does each phone number cost? How much do minutes cost? I add it all up for a real shop with 50 phone numbers and 10,000 minutes a month.
2. How easy it is to set up (25%)
I time how long it takes to make an account, get a phone number, put the code on a page, and place a test call. Less time means a higher score. The fastest tool was CallScaler at nine minutes. The slowest self-serve tool was CallRail at 22 minutes.
3. How well it tracks (25%)
Each tool has to send the right data back to Google Ads, Facebook, and your CRM. (A CRM, or customer relationship management tool, is where you keep your customer notes.) I make a known test call from a known ad. Then I check if the right ad shows up in the report. Bonus points for fast updates.
4. Fit for small business (25%)
I score the dashboard for clutter. I score the help team. I score how easy it is to add a new client. Three working business owners helped me with this part.
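The four categories above carry equal weight, so the overall score is just a weighted average. Here is a minimal sketch of that math. The subscores and the 0-to-10 scale are made-up examples for illustration, not real review data:

```python
# Each tool gets four subscores (here on a 0-to-10 scale).
# The four categories are weighted equally at 25% each.
WEIGHTS = {
    "price": 0.25,
    "setup": 0.25,
    "tracking": 0.25,
    "small_business_fit": 0.25,
}

def overall_score(subscores):
    """Weighted average of the four category subscores."""
    return sum(WEIGHTS[cat] * score for cat, score in subscores.items())

# Made-up example numbers, only to show how the math works:
example = {"price": 8, "setup": 9, "tracking": 7, "small_business_fit": 8}
print(overall_score(example))  # 8.0
```

Because the weights are equal, this is the same as a plain average of the four subscores; separate weights are kept in the sketch only so one category could be weighted up or down later.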
What I tested
For each tool, I made a real account if I could. I set up at least three phone numbers. I ran a real Google Ads test for two weeks. I made test calls from five different phones. I asked five owners about each tool.
How I picked the four tools
I started with a list of fifteen. I cut tools that did not have a free trial or a free plan, since most small owners want to try before they buy. I cut tools with no plain prices on the site. The four left are CallScaler, CallRail, WhatConverts, and Nimbata.
How I talked to owners
I called five small business owners about each tool. Each call was 30 minutes. I asked the same set of questions: how long setup took, what the monthly bill looks like, and what one thing they would change. Their answers shaped these scores.
What I did not score
I did not score features that small owners do not use. So no scores for big call center tools. No scores for fancy AI features that need a full-time analyst. No scores for plans built only for big firms.
I also did not look at vendor PR or analyst reports. Those are for a different kind of buyer than the one this site serves.
How I keep this fair
I update the tests once a year. I will note any change in price or feature in a small box at the top of the review. If a tool fixes a flaw I called out, I will say so.
If you think I got something wrong, send me a note. I will look. If I missed something, I will fix the page and say what I changed.
One more thing
I make a small fee when you sign up for a tool through my link. I told you that on the home page. I will tell you again here. The fee does not change my picks. The picks are the same with or without it. I include both good and bad in every review.
Further reading: schema.org Review type · Wikipedia entry on software review