Rethink District Budgeting – Part II

In a previous post, my colleague Dr. Tom Aberli and I discussed three forces that shape and influence district budget decisions: 1) a needs-based framework, 2) existing financial management practices, and 3) human nature. In this post, we present how districts can build on existing infrastructure and processes, with a new lens and framework, to overcome the challenges brought about by those forces. Specifically, we propose to: 1) differentiate between operation and investment expenditures, 2) track expenses on alignment, outcome, and improvement, and 3) re-orient how improvement is practiced.

Operation vs Investment Expenditures

As noted above, needs play a prominent role in how districts prioritize new spending but present challenges when it comes to making changes to existing spending. A new lens is needed to empower districts to routinely conduct a systematic examination and realignment of their spending in a fair and transparent manner. One way to do that starts with differentiating between operation and investment expenditures.

Of the hundreds of thousands of expenses a district and its schools incur, most funds are spent out of necessity. For example, school buildings’ utilities and maintenance costs must be paid for teaching and learning to take place. Under state and federal laws, schools are mandated to fund certain positions to provide services to students with special needs. At the same time, each school system likely has its own unique programs or services that the local community considers indispensable. For example, a town that prides itself on its high school football team is unlikely to cut the program’s funding, even when the district faces a budget crunch (Starr, 2000). Dictated by legal, cultural, and political considerations, these must-spend expenses usually are not associated with any specific achievement goals and can be classified as operation expenditures.

On the other hand, school systems spend a smaller but considerable portion of resources to improve the quality of the services they provide. For example, a district might decide to invest in a campaign to boost school attendance, or provide merit pay to teachers in hopes of improving student outcomes through economic incentives. With an aim in sight, these discretionary expenses are chosen by district and school leaders to produce returns in targeted areas. These expenses are classified as investment expenditures.

Compared with the compulsory nature of operation expenditures, investment expenditures are choices districts make from many options. If a selected investment does not yield a return, the district has the flexibility to replace it with another option. If the invested area is no longer an improvement priority, district leaders can redirect the dollars to meet other improvement needs. For example, if an investment in an attendance campaign does not lead to improvement in attendance, a district can experiment with giving economic incentives directly to students for good attendance and grades (Fryer, 2011). If attendance is no longer a priority, leaders can redirect the funds to other improvement needs. Table 1 below summarizes the main differences between operation and investment expenditures.

| | Operation Expenditures | Investment Expenditures |
| --- | --- | --- |
| Purpose | Necessary for providing basic services | Optional and elected for improving services |
| Flexibility | Limited flexibility in making changes | More flexible in making changes |
| Amount | Majority of a district’s budget | Smaller but can be a significant amount |
| Achievement goal | Usually not associated with a specific achievement goal | Initiated to achieve specific goals |
| Improvement goal | Increase efficiency | Ensure alignment and return on investment |

Table 1. Contrast between Operation and Investment Expenditures

Differentiating between operation and investment expenditures allows districts to manage funds differently for different improvement goals. With operation expenditures, the primary improvement goal is to increase efficiency without jeopardizing or undermining services. Systematic pursuit of that goal rests on periodic benchmark analysis, both horizontal and vertical. Horizontally, a district can compare itself against similar districts every few years on an array of operational key performance indicators (KPIs), such as those published annually by the Council of the Great City Schools (2022). Under-performing KPIs point to areas where improvements can and should be made by learning from benchmarking districts with higher efficiencies.

Vertically, a district can compare schools, departments, or divisions against efficiency metrics over time. For example, leaders can compare the operation expense per student by school every few years. If the comparison reveals significant increases at one school while the operation expense per student at other schools remains flat, analysis should be conducted to identify the reasons for the increase and any measures needed to address it. By tracking and comparing efficiency metrics both horizontally and vertically, districts can make informed, intentional decisions to maintain the same level of operation expenses while providing more services, provide the same level of services at a lower level of operation expenses, or curtail fast growth in operation expenses.
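The vertical comparison described above can be reduced to a simple calculation. The sketch below is illustrative only: the school names, dollar figures, and the 10% flagging threshold are hypothetical, and a district would substitute its own data and tolerance.

```python
# Hypothetical vertical benchmarking check: compute each school's
# operation expense per student in two review years and flag schools
# whose per-student cost grew faster than a chosen threshold.

expenses = {  # school -> {year: (operation_expense, enrollment)}
    "Lincoln Elementary": {2019: (2_400_000, 480), 2022: (2_520_000, 470)},
    "Jefferson Middle":   {2019: (3_600_000, 600), 2022: (4_500_000, 590)},
}

THRESHOLD = 0.10  # flag growth in per-student cost above 10% (illustrative)

def per_student(expense, enrollment):
    """Operation expense per enrolled student."""
    return expense / enrollment

for school, years in expenses.items():
    y0, y1 = sorted(years)
    base = per_student(*years[y0])
    latest = per_student(*years[y1])
    growth = (latest - base) / base
    flag = "REVIEW" if growth > THRESHOLD else "ok"
    print(f"{school}: {base:,.0f} -> {latest:,.0f} per student "
          f"({growth:+.1%}) [{flag}]")
```

In this toy data, Jefferson Middle’s per-student cost grows well past the threshold and gets flagged for analysis, while Lincoln Elementary’s modest growth does not.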

With investment expenditures, the improvement goal is twofold. Fiscally responsible leadership will make sure that 1) both current and future investments are aligned with improvement priorities and 2) investments are producing high returns at the lowest possible cost. If a target area is no longer a priority, districts should reallocate the resources to other areas of higher priority for improvement. If an investment is still in alignment but falls short of producing the expected return, leaders should demand changes to program implementation and consider postponing new investments in this area of need until success with the existing investment is achieved. When success is still unattainable after repeated efforts, districts have the responsibility to invest these resources in another promising option.

A similar cycle-based approach can be taken to achieve the two improvement goals for investment expenditures. When a new investment is launched, the expected return should be clearly defined to spell out the target population, outcome measures, and baseline and goal metrics. This information provides the basis for alignment and return on investment review. Most importantly, an investment cycle (such as every three or four years) for achieving the intended goal needs to be established to set a review schedule. Altogether, this process serves to develop a shared understanding of what needs to be accomplished by when and the potential consequences if success is not attained. Continuous improvement is the primary objective in this cycle-based approach. If an investment remains aligned with improvement priorities but falls short of meeting the goal at the end of the first investment cycle, it can be put on another investment cycle with necessary adjustments made. The adjustments can include, but are not limited to, setting a new goal, focusing on a smaller but more targeted group of students, tightening program implementation, or even developing new strategies. For an investment that has gone through several investment cycles, however, lack of return after multiple efforts will make it increasingly difficult to justify its continuation.

Track Expenses on Alignment, Outcome, and Improvement

A strategy to systemize a cycle-based approach to managing operation and investment expenditures is to create a system that tracks expenses on alignment, outcome, and improvement. Such a system should have the following components.

The first component classifies expenses as either operation or investment. The second component documents how the expenses align with the strategic plan. Next, an investment cycle can be assigned to each investment to set the schedule for alignment and return on investment review. For operation expenditures, the review cycle can be set either by operational area (e.g., accounts payable, procurement, transportation) or by specific cost center to examine efficiency based on horizontal or vertical benchmarking.

For expenses classified as investments, two additional sets of components are needed. One set defines the expected return from an investment and tracks who is being targeted for improvement (target population), what outcome measures will be used to gauge success (success metrics), and the target population’s current performance (baseline) and expected future outcome (goal) on those metrics.

The other set defines an investment’s theory of change by identifying the root cause of the problem and creating a logic model that “links outcomes (both short- and long-term) with program activities/processes and the theoretical assumptions/principles of the program” (W.K. Kellogg Foundation, 2004). Table 2 provides some examples of these components for investment and operation expenditures, respectively.
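The components described above can be represented as a simple record structure. The sketch below is a hypothetical schema, not the actual system used in any district mentioned here; all field names and example values are illustrative, and a real implementation would adapt them to the district’s ERP or spreadsheet.

```python
# Hypothetical data model for the tracking components: expense type,
# strategic alignment, review cycle, expected return, and theory of change.
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class ExpenseType(Enum):
    OPERATION = "operation"
    INVESTMENT = "investment"

@dataclass
class ExpectedReturn:
    target_population: str  # who is targeted for improvement
    success_metric: str     # outcome measure used to gauge success
    baseline: float         # current performance on the metric
    goal: float             # expected future outcome on the metric

@dataclass
class TrackedExpense:
    name: str
    expense_type: ExpenseType
    strategic_alignment: str                       # link to the strategic plan
    review_cycle_years: int                        # sets the review schedule
    expected_return: Optional[ExpectedReturn] = None  # investments only
    theory_of_change: str = ""                     # root cause + logic model summary

# Illustrative example: an attendance-campaign investment
attendance_campaign = TrackedExpense(
    name="Attendance campaign",
    expense_type=ExpenseType.INVESTMENT,
    strategic_alignment="Strategic plan goal: increase attendance",
    review_cycle_years=3,
    expected_return=ExpectedReturn(
        target_population="Students with 10+ absences last year",
        success_metric="Average daily attendance rate",
        baseline=0.88,
        goal=0.93,
    ),
    theory_of_change="Outreach raises family awareness, which raises attendance",
)
```

An operation expenditure would use the same record with `expense_type=ExpenseType.OPERATION` and no `expected_return`, keeping one schema for both categories.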

Generating and executing these new components requires collaboration among multiple departments, including the finance, program, and accountability teams. As a result, financial management is no longer the siloed responsibility of a finance team focused mainly on accounting for compliance. Rather, effective and efficient use of limited resources for increasing student achievement becomes a responsibility shared by all parties involved in the process, which is how it needs to be. Most enterprise resource planning (ERP) systems probably do not support tracking these components yet, so school systems will need to develop something to implement the change. In Jefferson County Public Schools, we developed an online investment tracking system and linked it to our ERP system, MUNIS. The system is used not only to record these new components but also to support collaboration among teams and facilitate reviews. Leaders in the Saratoga Springs City School District developed a fillable PDF form for this purpose. In its simplest implementation, an Excel spreadsheet could do the job.

Re-orient How Improvement Is Practiced

Simply put, we have been trying to improve through innovation without systematically analyzing the efficacy of the adopted programs and their impact on the other strategies implemented by the district. While innovations are undoubtedly important, problems arise when we pile them up without making sure they add up. The inadvertent accumulation of innovations often begets three problems hampering our improvement efforts.

First, as noted earlier, programs brought in by different leaders tend to reflect the hot-button issues and solutions of their respective times, and thus vary in focus of change, improvement strategy, and even identification of root causes. Without careful coordination and deliberate calibration, their co-existence leads to inconsistency and confusion (DuFour, 2003). Second, with new innovative programs taking the spotlight, existing ones quickly lose their luster; the lack of sustained institutional attention and support hurts their chances of succeeding, which may already be diminished by inadequate implementation. Third, continual addition without subtraction overwhelms schools, drains people’s attention and energy (also finite resources), and causes innovation fatigue and indifference (Reeves, 2020).

With the new moves introduced above, leaders are afforded a different approach to improvement that emphasizes coherence and puts continuous improvement at the center (Bryk et al., 2015; Lewis, 2015). For each new spending proposal, the new approach starts with a decision on whether the proposed spending should be treated as an operation or investment expenditure. If classified as an investment expenditure, all information listed in Table 2 should be specified in the budget proposal, which is critical for not only the subsequent adoption decision but also continuous improvement decisions down the road. 

Employing the information from both the new proposal and the existing investments being tracked, leaders’ deliberation can expand from alignment with priorities, cost, and evidence of impact to three additional areas (Yan & Hollands, 2018). The first area examines whether the district has already invested in programs that share the same or similar target population and target outcomes as the newly proposed program. The second area concerns how many active investments the department proposing the new spending is currently implementing, how well those investments are being implemented, and the department’s capacity to take on a new program. The third area asks what adjustments to which programs are required for full implementation of the new program, and how those adjustments could affect the performance and impact of either program.

Conclusions from deliberation in these six areas (alignment with priorities, cost, evidence of impact, overlap/redundancy with existing programs, implementation capacity, and coherence with existing programs) have direct implications for whether the new program should be adopted and/or an existing program should be discontinued. This investigative process will also inform the timing of adoption and/or discontinuation, which team will be put in charge of implementation, and what coordination and adjustments are needed from related programs and departments. In a complex system where things are interconnected and change is non-linear (Fullan, 2001; Jacobson et al., 2019), all of these decisions matter for the success of both the new program and the existing programs affected by its implementation, and they require participation from all stakeholders and continual recalibration between programs.

Following an adoption decision, it is important to ensure that the newly approved investment receives sustained attention and support to succeed and, equally important, that actions will be taken if it does not produce the expected return. This is achieved through the review that takes place when the program reaches the end of the investment cycle (EOC) specified during the approval process. Just as all new technologies eventually become appliances (Bertram & Hogan, 1998), all innovations eventually become just another program and run the risk of turning into routines with an outdated mission or motions without a clear purpose. EOC review moves an existing program from the rear-view mirror back to center stage, under the spotlight it both deserves and needs to succeed.

EOC review asks four big questions. First, is the investment aligned with the district’s current improvement priorities? Second, what is the actual return on investment (i.e., how many students from the intended population have been served and what are their outcomes)? Third, how has the program been implemented (i.e., how much of the budgeted money has been spent and which planned activities have taken place)? Last, if the program does not meet its goal, what would it take to help the investment succeed (e.g., increasing intentionality by focusing on a smaller student population, better collaboration and coordination from other departments, lessening schools’ burden by reducing the number of new initiatives they are asked to implement)?
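The first two EOC review questions lend themselves to a mechanical check. The sketch below is a hypothetical illustration using the attendance example from earlier in this post; the priority names, metrics, and figures are assumptions, not data from any district.

```python
# Hypothetical EOC review check for one investment: is it still aligned
# with current priorities, and how much of the baseline-to-goal gap on
# the success metric has actually been closed?

def eoc_review(aligned_priority, current_priorities, baseline, goal, actual):
    """Return (still_aligned, goal_met, progress) for one investment."""
    still_aligned = aligned_priority in current_priorities
    goal_met = actual >= goal
    # share of the baseline-to-goal gap that was closed (can exceed 1.0)
    progress = (actual - baseline) / (goal - baseline)
    return still_aligned, goal_met, progress

# Illustrative example: attendance rate, baseline 88%, goal 93%, actual 91%
aligned, met, progress = eoc_review(
    aligned_priority="attendance",
    current_priorities={"attendance", "literacy"},
    baseline=0.88, goal=0.93, actual=0.91,
)
```

In this example the investment stays aligned and has closed 60% of the gap without meeting the goal, which under the cycle-based approach would argue for another cycle with adjustments rather than immediate discontinuation.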

Notably, EOC review differs from the end-of-year (EOY) program review conducted annually by program staff. The latter usually takes place within a department, involves the program director and staff, and focuses on programmatic changes inside the department to improve implementation. Applying a systemic lens, EOC review requires participation of senior leaders across divisions, focuses on alignment and return on investment, and could result in changes in scope, scale, funding, and implementation for not only the reviewed program but potentially other programs as well. Table 3 shows the differences between EOC review and EOY review.

| | EOC Review | EOY Review |
| --- | --- | --- |
| Level | System level | Program level |
| Frequency | Investment cycle | Annually |
| Participants | Program director and senior leaders across divisions | Program director and staff within a department |
| Focus | Alignment, ROI, and coherence | Implementation |
| Change | Scope, scale, funding of both a reviewed program and potentially other programs | Programmatic and within a department |

Table 3. Differences between EOC Review and EOY Review

Ideally, three things can be achieved from an EOC review process: 1) investments that are ineffective after multiple improvement efforts or no longer aligned with priorities are re-invested in other improvement options; 2) investments that have not yet delivered returns but are worth more time or need another opportunity are put on a path to success, with criteria and timeframe set for the next review; and 3) investments with high returns are recognized and potentially replicated or expanded to benefit more students.


Conclusion
The annual budgeting process is a valuable opportunity for district leaders to 1) systematically examine both resource use and district programming and 2) use the findings to inform and drive change to optimize resource use and improve program efficacy that will lead to increased student achievement. However, we have not been able to capitalize on the full potential of this opportunity due to three forces.

With a needs-based framework, each established expense is first legitimized and then perpetuated as meeting a high-priority need. When it is difficult to identify and justify one need as more important than another, adjusting resource use becomes very challenging, if not impossible. The accounting system districts rely on for managing their financial resources embodies this lack of differentiation between improvement needs and operational needs and creates departmentalized space for siloed practices. As a result, leaders lack a fair and transparent structure for systematically reviewing resource use and program impact and for making nuanced, flexible adjustments based on that analysis in a collaborative way. Compounding these conceptual and infrastructure constraints is our human nature, which favors addition over subtraction in problem solving and is biased against subtractive changes when it comes to decision making.

To overcome these challenges, district leaders can integrate three new moves into their existing “budget dance”. To start, district expenses need to be differentiated between operation expenditures and investment expenditures. In fact, we already make this distinction when we pose the question “does it work?” for some spending items but not others; we just do not make the classification decision explicit or tie it to continuous improvement implications that involve potential changes in program budget and implementation.

Differentiating between operation and investment expenditures lays the conceptual foundation for improvement. Tracking expenses on alignment, outcome, and improvement turns the improvement opportunity into concrete tasks and processes that can be executed and, more importantly, sustained. Specifically, it helps systemize continuous improvement in three ways. First, collecting and using data described in Table 2 adds rigor and discipline to the decision-making process, which should lead to better decisions on adoption, planning, implementation, evaluation, and discontinuation of investments. Second, by linking financial data with outcome data and program activities through a logic model, it brings together all parties needed for making an investment successful to ensure efforts are concerted, coherent, and calibrated throughout the implementation and, eventually, resources are allocated based on alignment with priorities and return on investment. Third, it helps make the new “budget dance” moves annual routines, regardless of the political environment and even when the district is flush with money.

Traditionally, adopting a new program or launching a new initiative has been the primary means of improvement. The new budget dance moves discussed in this post promote an improvement approach that is more conservative about addition. Leaders should take extra precautions when adopting new programs and interventions to avoid the “presence of too many disconnected, episodic, piecemeal, and superficially adorned projects”, which is a bigger problem than the absence of innovation (Fullan, 2001, p. 109). It is important to understand that adopting a new program is not an isolated improvement event. Leaders should take advantage of the information collected through tracking to make sure new investments are congruent with the overall strategy and complement and support existing programs.

Oftentimes, the decision to approve a new investment marks the end of engagement from district leaders. Other than talking about it at a high level, they usually take a hands-off approach and delegate most, if not all, responsibilities to the program director and staff. While it is appropriate to leave execution to mid-level administrators, it is district leaders’ responsibility to set expectations, provide the conditions needed for the investment’s success, and take action if the investment fails to produce a return. Leaders should seize the opportunity created by EOC review to re-engage with investments approved by their own or a previous administration and make new decisions that improve their impact or lead to better use of the resources.

District leaders are often accused of wasting money on things that do not work instead of spending limited resources on research-proven programs. However, improvement is more than replacing an ineffective program with another innovation. In many situations, continuously improving a program that lacks returns is a better strategy than simply cutting it. Discontinuing an ineffective program is not in itself a solution, and it can cost significant institutional energy and leaders’ political capital. In reality, it is unlikely that a program works for no one (Yan & Slagle, 2011). Because such programs are already interwoven with the system, what they often need to succeed is more focused and better-executed implementation, which requires district leaders to remain engaged and pull levers that only they can (e.g., narrowing the focus to students for whom the program is most effective based on evidence, or asking other programs to make adjustments). If the program fails to deliver a return after repeated efforts, leaders can use the cumulative evidence documented in multiple EOC reviews to communicate and justify the discontinuation decision. While budgeting is the focal point of this discussion, we should not lose sight of the fact that money is not the central issue here. Ultimately, it is about how districts can develop disciplined “self-management of improvement” (Elmore, 2003) to provide coherent and coordinated services to students. These new “budget dance” moves offer new hope and possibilities.


References
Adams, G., Converse, B. A., Hales, A., & Klotz, L. (2022, February 4). When Subtraction Adds Value. Harvard Business Review.

Baker, B. D. (2016). Does Money Matter in Education? Second Edition. Albert Shanker Institute.

Barshay, J. (2019, December 16). What 2018 PISA international rankings tell us about U.S. schools. The Hechinger Report.

Bertram, B., & Hogan, M. (1998). The Disappearance of Technology: Toward an Ecological Model of Literacy. In D. Reinking, M. C. McKenna, L. D. Labbo, & R. D. Kieffer (Eds.), Handbook of Literacy and Technology: Transformations in A Post-typographic World (pp. 269–281). Routledge.

Binkley, C., & Foley, R. (2021). Flush with COVID-19 aid, schools steer funding to sports. AP NEWS.

Bryk, A. S., Gomez, L. M., Grunow, A., & LeMahieu, P. G. (2015). Learning to Improve: How America’s Schools Can Get Better at Getting Better. Harvard Education Press.

Council of the Great City Schools. (2022). Managing for Results in America’s Great City Schools, 2022, Results from Fiscal Year 2020-2021.

DuFour, R. (2003). Central-Office Support for Learning Communities. The School Administrator, 60(5), 13–18.

Dynarski, M. (2017, March 2). It’s not nothing: The role of money in improving education. Brookings.

Elmore, R. (2003). Knowing the Right Thing to Do: School Improvement and Performance-Based Accountability. National Governors Association Center for Best Practices.

Fryer, R. (2011). Financial Incentives and Student Achievement: Evidence from Randomized Trials. Quarterly Journal of Economics, 126(4), 1755–1798.

Fullan, M. (2001). Leading in a Culture of Change. Jossey-Bass.

Fullerton, J. (2021). Bridging the Gaps in Education Data (p. 9). American Enterprise Institute.

Goldstein, D. (2019, December 3). ‘It Just Isn’t Working’: PISA Test Scores Cast Doubt on U.S. Education Efforts. The New York Times.

Handel, D., & Hanushek, E. A. (2022). U.S. School Finance: Resources and Outcomes (SSRN Scholarly Paper No. 4306724).

Hanushek, E. A. (1997). Assessing the Effects of School Resources on Student Performance: An Update. Educational Evaluation and Policy Analysis, 19(2), 141–164.

Hartman, W. T. (1988). School District Budgeting. Prentice Hall.

Jackson, C. K. (2018). Does School Spending Matter? The New Literature on an Old Question. NBER Working Paper No. 25368. National Bureau of Economic Research.

Jacobson, M. J., Levin, J. A., & Kapur, M. (2019). Education as a Complex System: Conceptual and Methodological Implications. Educational Researcher, 48(2), 112–119.

Kardish, C. (2015, June 22). What Happens When Schools Stop Providing Buses? Governing: The Future of States and Localities.

Kavanagh, S. C., & Levenson, N. (2017). Academic Return on Investment: Foundations and Smart Practices. Government Finance Officers Association.

Kopsa. (2022, December 12). The City That Kicked Cops Out of Schools and Tried Restorative Practices Instead. In These Times.

Lewis, C. (2015). What is improvement science? Do we need it in education? Educational Researcher, 44, 54–61.

National Center for Education Statistics. (2019). Education Expenditures by Country (p. 8).

National Center for Education Statistics. (2020). Digest of Education Statistics, 2020. National Center for Education Statistics.

Olson, L. (2007, June 26). Harvard Project Boils Down Ingredients for District Success. Education Week.

Pierce, P. R. (2018). The Origin and Development of the Public School Principalship. Creative Media Partners, LLC.

Reeves, D. B. (2020). The Learning Leader: How to Focus School Improvement for Better Results. ASCD.

Roza, M. (2022). Time to Change the District Budget Dance. School Business Affairs, November.

Schmidt, J. (2008). History of School Counseling. In Hardin L.K. Coleman & Christine Yeh (Eds.), Handbook of School Counseling (pp. 3–13). Routledge.

Shand, R., Leach, S. M., Hollands, F. M., Chang, F., Pan, Y., Yan, B., Dossett, D., Nayyer-Qureshi, S., Wang, Y., & Head, L. (2022). Program Value-Added: A Feasible Method for Providing Evidence on the Effectiveness of Multiple Programs Implemented Simultaneously in Schools. American Journal of Evaluation, 43(4), 584–606.

Sparks, S. D. (2019, October 30). “No Progress” Seen in Reading or Math on Nation’s Report Card. Education Week.

Starr, L. (2000, May 31). Football or Full-Day Kindergarten? —Budget Cuts Force Tough Choices.

STN Editor. (2014, March 6). Updated: Amended Indiana Bill Allowing School Bus Advertising Passes Senate. School Transportation News.

Strategic Data Project. (2020). The Hidden Truth in ROI.

Strategic Data Project. (2021). Fulton County Schools Follows the Evidence.

Thaler, R. H., & Sunstein, C. R. (2009). Nudge: Improving Decisions About Health, Wealth, and Happiness. Penguin.

Urban Institute. (2017, October 23). Elementary and Secondary Education Expenditures. Urban Institute.

US Department of Education. (2021). American Rescue Plan Act of 2021: Elementary and Secondary School Emergency Relief Fund (ARP ESSER).

Weiler, S. C., & Cray, M. (2011). Police at School: A Brief History and Current Status of School Resource Officers. Clearing House: A Journal of Educational Strategies, Issues and Ideas, 84(4), 160–163.

W.K. Kellogg Foundation. (2004). Logic Model Development Guide.

Yan, B. (2020). Re-engineering Choice Architecture to Improve Budget Decisions. School Business Affairs, 8–11.

Yan, B., & Hollands, F. (2018). How to Prioritize Programs for Funding Decisions? School Business Affairs, September, 10–13.

Yan, B., & Slagle, M. (2011). The Limited Value of “What Works” Research. Education Week.
