I have seen as much as a 25% variance in the number of distinct clients seen for what is nominally the same year, depending on the exact dates chosen. It doesn’t matter whether you are using Medicat, Point and Click, Pyramid, Titanium Schedule, or any other software. One embedded report might say 4,000 distinct clients, but shift the dates by 6 or 8 weeks and the same report might say 5,000 distinct clients. The cause of this issue is that you are including people from partial terms (i.e., a partial quarter, semester, or year).
To see the problem visually, consider the following diagram. It shows the schedules of three students (A, B, and C). The green boxes represent years, which are labeled along the timeline at the bottom. Student A graduated at the end of the term ending in 2017. Student B started school in 2016. Student C graduates at the end of the term ending in 2018. The blue ovals represent the date ranges during which the counseling center saw the student. The red box represents a one-year window over which we are trying to count how many distinct clients were seen. The first illustration shows that only students A and C were seen during the one-year period starting in 2016 and ending in 2017.
Watch what happens if we pick a date range a little later than the previous one. Now the red box contains appointments for all three students. The count of distinct clients seen would say 3 students, even though only 2 students were really seen during the 2016-2017 academic year. So it is now easy to see how a shift of a few weeks can create a large error in the count of distinct clients seen (here, a 33% overstatement).
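To make the arithmetic concrete, here is a minimal sketch in Python of the window-shifting effect. The appointment dates for students A, B, and C are made-up values chosen to match the diagram, not real data.

```python
from datetime import date

# Made-up appointment dates matching the diagram: A graduates in spring 2017,
# B (who started school in 2016) is first seen by counseling in fall 2017,
# and C graduates in 2018.
appointments = {
    "A": [date(2016, 9, 12), date(2017, 2, 20), date(2017, 4, 10)],
    "B": [date(2017, 9, 18), date(2017, 11, 6)],
    "C": [date(2016, 10, 3), date(2017, 10, 30)],
}

def distinct_clients(appointments, start, end):
    """Count clients with at least one appointment inside [start, end]."""
    return sum(
        1 for dates in appointments.values()
        if any(start <= d <= end for d in dates)
    )

# A window aligned with the 2016-2017 academic year catches only A and C.
print(distinct_clients(appointments, date(2016, 8, 15), date(2017, 8, 14)))  # 2

# Slide the same one-year window roughly six weeks later: B's fall-2017
# appointment now falls inside it, and the count jumps to 3 (a 33% overstatement).
print(distinct_clients(appointments, date(2016, 10, 1), date(2017, 9, 30)))  # 3
```

Nothing about the underlying data changed between the two calls; only the placement of the one-year window did.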
The majority of students start school in the fall and finish after the spring term of some later year. Students also take more classes during the fall and spring terms than over the summer. Finally, I have noticed that students who start counseling (such as group sessions) in the fall are more likely to continue counseling in the spring than students who start counseling in the spring are to continue into the summer or fall. All of this means that the same effect shown above appears if you count students starting from the beginning of the spring term rather than the beginning of the fall term.
In general, the closer your start or end date falls to the center of a term, the worse the distortion will be. This is especially true of the centers of quarters or semesters, but it also applies, to a lesser degree, to the center of a year. So if your window begins only a day or two into the start of a year, the only distortion comes from the few clients seen in the first day or two of the added year. Any clients with an appointment in the first day or two of the year that gets cut short will likely have some other appointment that year, so they will not be subtracted from the count as readily as new term-specific clients will be added.
If you would like to verify this effect with your own data, run a report that gives the number of distinct clients seen for the following date ranges, and see if they come close to matching:
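If your system can export a flat list of appointments, a sketch along these lines will run the comparison for whatever windows you choose. The file name, column names, and example windows here are placeholder assumptions, not the actual field names or reports in Medicat, Point and Click, Pyramid, or Titanium Schedule.

```python
import pandas as pd

# Placeholder export with one row per appointment; substitute the file
# and column names your scheduling system actually produces.
appts = pd.read_csv("appointments.csv", parse_dates=["appointment_date"])

def distinct_clients(appts, start, end):
    """Distinct client IDs with at least one appointment in [start, end]."""
    in_window = appts["appointment_date"].between(pd.Timestamp(start), pd.Timestamp(end))
    return appts.loc[in_window, "client_id"].nunique()

# Example one-year windows that all claim to cover the same academic year;
# replace these with the date ranges you want to compare.
windows = [
    ("2016-08-15", "2017-08-14"),
    ("2016-09-01", "2017-08-31"),
    ("2016-10-01", "2017-09-30"),
]
for start, end in windows:
    print(f"{start} to {end}: {distinct_clients(appts, start, end)} distinct clients")
```

If the counts differ by more than a few percent, you are seeing the partial-term effect described above.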
So, how do you get as close as possible to an accurate count of distinct clients seen?