The Origins and Some Findings of PBF Models

Across the nation, institutions of higher education compete with one another for scarce state funding. Colleges and universities subject to performance-based funding (PBF) models have seen mixed results, and some observers are skeptical of the models' long-term viability. This article presents the origins of PBF models and highlights some of their results. Performance-based funding first appeared in 1978 under the Tennessee Higher Education Commission, and by the mid-2000s at least 26 states had put such policies into practice (Harnisch, 2011). Current PBF models differ from their historical counterparts in their planning and design, their measurement indicators, and the amount of funds allocated (Thornton & Friedel, 2016). The original PBF methods placed greater emphasis on completion and transfer rates, leaving advancement and retention initiatives overlooked (Dougherty & Natow, 2010). It was also found that the original PBF measures related more directly to overall program effectiveness than to student outcomes or institutional performance (Dougherty & Reddy, 2011). Furthermore, the original PBF plans were designed and implemented without the support of educational leaders, resulting in a disconnect between state policy and institutional missions (Thornton & Friedel, 2016). Because they failed to incentivize educational institutions to change, the original PBF methods were allowed to lapse (Miao & Ju, 2012). 
In Florida, colleges are rated in ten categories that include: first-time-in-college (FTIC) retention rates with a GPA above 2.0, FTIC six-year graduation rates, four-year graduation rates for AA transfer students, percentage of bachelor's degrees with excessive hours, percentage of bachelor's graduates employed full time or continuing their education, bachelor's degrees in STEM fields, average cost per bachelor's degree, median salary for bachelor's graduates, and number of bachelor's degrees awarded to minorities (Frank, 2016). For the 2018-2019 fiscal year, the Florida Board of Governors allocated $560 million in total PBF funds across 11 institutions (Florida Board of Governors, 2018). The move to tie funding to standardized performance metrics has drawn arguments from educators, students, and lobbyists. Opponents argue that performance-based funding can reduce educational access and diversity for the public, that general performance-based funding efforts are not indicative of higher educational outcomes, and that performance-based funding will lead to the professionalization of education (Kowal, n.d.). Nicholas Hillman, an assistant professor of education at the University of Wisconsin at Madison who has studied such state formulas, argues that performance-based funding is rarely effective, stating that "performance-based funding schemes are more likely to work in non-complex situations where performance is easily measurable, tasks are simple and routine, objectives are unambiguous, employees have direct control over the production process, and no other people are involved in producing the result" (Inside Higher Ed, 2016). Findings from the Center on Budget and Policy Priorities show that, in the aftermath of the 2008 recession, states closed approximately $425 billion in budget deficits (Schoen, 2015). 
Many states passed these cuts on to education subsidies: the Center reported that states cut spending by $2,353 per student, a 28% decrease from 2008 (Schoen, 2015). Arizona and New Hampshire cut per-student spending by 50%; eleven states cut it by 33% (Schoen, 2015). These cuts have further strained the mechanisms through which universities receive funding, placing greater emphasis on rigid performance measures. But as the economy continues to improve, increasing tax revenues, is performance-based funding hindering the education students receive? John W. Schoen, a business reporter for CNBC, observed that, on average, the education students receive differs little from what it was 15 years ago (Schoen, 2015). Some universities have actually increased spending on student-facing amenities, such as gymnasiums, recreation centers, and dormitory improvements, to attract students in the education market. Regardless of school spending, the reality remains that subsidies have been cut on the public side, leaving students and families to pay a greater share of the expense and private schools to further widen the wealth gap. According to the College Board, average tuition has increased about 5 percent over the past decade, rising 3.7 percent at private colleges and 2.9 percent at public colleges between 2014 and 2015; a rate substantially higher than general inflation and household income growth (Saving for College, 2016). Recent state PBF formulas have received little research into their effects and outcomes, as their use is still relatively new (Dougherty, 2013). It is argued that their primary focus is on student achievement; the institutional impact, however, has been overlooked (Dougherty, 2013). In a qualitative study conducted at Iowa State University, Zoe Mercedes Thornton and Janice Nahra Friedel provide a synopsis of the organizational impact of PBF on four small, rural community colleges. 
Their findings indicated that performance-based policies influenced college operations, improvement efforts, and perceptions of the school (Thornton & Friedel, 2016). A 2013 study that examined 25 states over a 20-year period found that PBF had an overall low effect on degree completion (Tandberg & Hillman, 2013). The study also found that it took five years from implementation for any disparity in graduation rates to appear (Thornton & Friedel, 2016). Among states with community colleges, nine out of 18 saw little to no increase in degree completion, while only four out of 18 appeared to have statistically significant increases (Thornton & Friedel, 2016). Kevin J. Dougherty, associate professor of higher education at Columbia University's Teachers College, and researcher Vikash Reddy examined 60 studies of PBF measures and their impact on educational institutions. They found that although PBF had institutional effects, such as changes in funding, use of data in planning, and programmatic and service changes, there was no concrete evidence that PBF significantly increased graduation rates, completion of remedial courses, or retention rates (Dougherty & Reddy, 2011). Writing for a Texas news outlet, Claire Cardona describes the disadvantages of PBF, pointing out that no formula can measure everything a college does (Cardona, 2013). Her article highlights the fear of Texas A&M International University President Ray Keck, who believes that a significant portion of institutional efforts are not considered, and may even be ignored, in PBF measurement models (Cardona, 2013). Policy analyst Thomas L. Harnisch indicates that when institutions begin to define their goals around performance funds, they risk compromising access, equity, institutional mission, and stability (Harnisch, 2011). 
A study by Ben Jongbloed and Hans Vossensteyn of the University of Twente in the Netherlands compared government PBF policies across Australia, Belgium (Flanders), Denmark, France, Germany, Japan, the Netherlands, New Zealand, Sweden, the United Kingdom, and the United States. They described the mechanisms used for university funding and the extent to which grants to universities are performance-oriented (Jongbloed & Vossensteyn, 2001). Their study found that most countries had research-oriented rather than teaching-oriented performance measures, largely as a result of research council funding (Jongbloed & Vossensteyn, 2001). For the United States, the study indicated that the primary reason for using performance indicators was as a measure of accountability (Jongbloed & Vossensteyn, 2001). Although incentive funds are allocated in higher education budgets, these incentives were found to be relatively small (Jongbloed & Vossensteyn, 2001). Only three countries were found to tie teaching funding to performance: Australia, Germany, and New Zealand (Jongbloed & Vossensteyn, 2001). Regarding funding for teaching, the article indicated that governments remain reluctant to link resources to enrollment-based measures (Jongbloed & Vossensteyn, 2001). This could be attributed to the belief that performance should be understood in terms of increasing diversity and responsiveness to students' needs (Jongbloed & Vossensteyn, 2001). The study also indicated that if college grants were tied to enrollment numbers, institutions would be more likely to follow their customer base and forgo academic excellence in pursuit of customer incentives (Jongbloed & Vossensteyn, 2001). Jung Cheol Shin published a study measuring changes in institutional performance following the adoption of new PBF-based accountability measures. 
Measurements were based on graduation rates and levels of federal research funding (Shin, 2010). Based on ten years of collected data, the study indicated that states adopting performance-based accountability did not experience a notable increase in institutional performance (Shin, 2010). The study also noted that where PBF initiatives did not improve institutional performance, states may in fact have failed to fully implement the reform's components (Shin, 2010). The new initiatives may also not have included the support needed to make targeted changes within institutions, with the result that faculty did not improve their teaching and research performance (Shin, 2010). Finally, factors external to the political sphere were also found to explain changes in institutional performance, calling the validity of performance-based accountability measures into question (Shin, 2010).