Answering the Con Social Spending Tradeoff Argument

One of the most popular Con arguments at Blake was that increased military spending would trade off with social welfare spending, particularly Obamacare and food stamps. Teams would then read impacts that assume the complete disappearance of those programs.
There are a number of problems with this argument.
First, it is non-unique. We have $19 trillion in debt, and it increases every year. The Republicans are going to pass massive tax cuts and Trump wants a massive infrastructure plan. Both will increase the deficit. If the Republicans are going to cut social spending when the deficit increases, they are going to do that now.

Wall Street Journal, December 5, 2016, https://blogs.wsj.com/moneybeat/2016/12/05/trump-deficit-spending-may-not-impress-the-market/
Mr. Trump has said that he will seek to boost economic growth through a series of tax cuts and fiscal spending. Already, analysts are forecasting a broad rise in corporate earnings due to tax cuts, which could have especially large impacts on sectors like banks and retailers. That could support the next leg higher in many stocks, the thinking goes. But the tax plan Mr. Trump has proposed would cause the budget deficit to expand to nearly $1 trillion in 2017 from about $590 billion this year, the bank’s researchers believe.

There is no evidence that the Republicans will increase social spending now or that they won’t cut it. There is no social spending uniqueness in a Republican world for defense spending to trade off with. Republicans already want to wipe out Obamacare, reduce Medicaid spending, and privatize Medicare. They aren’t doing this because of military spending – they just hate these programs.
Second, under the Budget Control Act (BCA), which remains in effect at least through the end of January, Congress has agreed to increase social spending at the same rate as military spending. While it is not certain this arrangement would survive a military spending increase (the BCA could be repealed), an increase in military spending now would very likely result in both categories being increased.

Todd Harrison, Director, Defense Budget Analysis, Director, Aerospace Security Project and Senior Fellow, International Security Program, Center for Strategic and International Studies, August 1, 2016, What Has the Budget Control Act of 2011 Meant for Defense? https://www.csis.org/analysis/what-has-budget-control-act-2011-meant-defense
Q5: Is the BCA still in effect?
A5: Yes, but it has been modified three times since it was enacted. Just before the budget caps went into enforcement in January 2013, Congress passed a last-minute deal known as the American Taxpayer Relief Act of 2012. Among other things, this law raised the budget caps slightly for FY 2013, with equal increases on the defense and nondefense sides of the budget caps. But it paid for these increases in part by lowering the caps in FY 2014. In December 2013, Congress modified the BCA for a second time with the Bipartisan Budget Act of 2013. This two-year deal raised the budget caps for FY 2014 and FY 2015, again with equal increases for the defense and nondefense sides of the budget. Both Congress and the administration stuck to this deal, appropriating funding at the revised budget cap levels in both years, thus avoiding sequestration being triggered.

The evidence Con teams read assumes Trump would increase military spending, but if you argue for an increase before January 20th, it would happen in a way that Obama supports, meaning an increase in both defense and social spending.
Third, their impact is ludicrous: it assumes the programs are eliminated, and it assumes that people have no other source of food or health care. The terminal impact is absurd even if increased military spending causes some reductions.
Fourth, military spending massively increases economic growth. These jobs would generate tax revenue through economic activity, reducing the need for social welfare.

National Conference on State Legislatures, September 9, 2016, Military’s Impact on State Economies, https://www.ncsl.org/research/military-and-veterans-affairs/military-s-impact-on-state-economies.aspx
The Department of Defense (DoD) operates more than 420 military installations in the 50 states, the District of Columbia, Guam and Puerto Rico. These installations—which may also be referred to as bases, camps, posts, stations, yards or centers—sustain the presence of U.S. forces at home and abroad. Installations located within the United States and its territories are used to train and deploy troops, maintain weapons systems and care for the wounded. Installations also support military service members and families by providing housing, health care, childcare and on-base education. The DoD contributes billions of dollars each year to state economies through the operation of military installations. This spending helps sustain local communities by creating employment opportunities across a wide range of sectors, both directly and indirectly. Active duty and civilian employees spend their military wages on goods and services produced locally, while pensions and other benefits provide retirees and dependents a reliable source of income. States and communities also benefit from defense contracts with private companies for equipment, supplies, construction and various services such as health care and information technology. The economic benefits created by military installations are susceptible to change at both the federal and state levels. Recent events such as the drawdown of troops in Iraq and Afghanistan, federal budget cuts, and potential future rounds of Base Realignment and Closure have left government officials uncertain of the future role and sustainability of military installations. These trends have been a driving force behind many states’ decisions to commission studies that define the military activity and infrastructure that exists in the state and measure the economic impact of military presence. Economic impact studies allow states to better advocate on behalf of their installations and plan for future growth or restructuring. 
At least 26 states have recently completed or are in the process of completing military economic impact studies. Impacts generally include salaries and benefits paid to military personnel and retirees, defense contracts, local business activity supported by military operations, tax revenues and other military spending. In 2015, for example, military installations in North Carolina supported 578,000 jobs, $34 billion in personal income and $66 billion in gross state product. This amounts to roughly 10 percent of the state’s overall economy. In Kentucky the military spent about $12 billion from 2014 to 2015, which was a reduction of $3.5 billion since the last report in 2012. With around 38,700 active duty and civilian employees, the military is the largest employer in Kentucky by more than 21,000 jobs. They also support the highest payroll with a total of $3.85 billion, $80 million higher than the second largest industry in Kentucky. Even states with relatively small military footprints have reported significant economic impacts. A study in Massachusetts, for example, found that by investing $9.1 billion in FY 2011, military installations contributed another $4.6 billion in spending and added more than 30,600 jobs to the state economy. The table below is a representation of military economic impact studies done on behalf of each of the 50 states. Most of the studies were done internally or commissioned by state organizations, while others were sourced from regional or national analyses or other publications. At least 23 states – Arizona, California, Colorado, Connecticut, Delaware, Florida, Georgia, Illinois, Kansas, Kentucky, Maryland, Massachusetts, Michigan, Missouri, Nebraska, Nevada, New Jersey, North Carolina, Oklahoma, South Carolina, Texas, Virginia and Washington – utilize numbers that were gathered by internally commissioned studies.

Fifth, undermining the global economy undermines the US economy. Extend our constructive argument that China aggression undermines international norms and the global economy.

Ian Bremmer, September 2016, Superpower: Three Choices for America’s Role in the World, Kindle edition, page number at end of card. Ian Bremmer is an American political scientist specializing in U.S. foreign policy, states in transition, and global political risk. He’s also the President of the Eurasia Group.
Still others warn that in today’s interconnected world, it’s dangerously naïve to believe that America can ever really be safe in an unsafe world. We can’t create jobs and grow our economy without a stable global economy. No nation can do more than the United States to promote and protect this better world, and it is America’s values, not its economic weight or military might, that we leave behind when the troops head home. Values that help others stand on their own. Washington, they argue, must get its financial house in order, invest in a stronger America, and pursue U.S. interests around the world. But it is shortsighted to believe that we can only build lasting strength at home by retreating from the world or by renouncing our faith in the power of democracy, freedom of speech, rule of law, and freedom from poverty and fear to create broadly shared peace and prosperity. This argument has merits too. Bremmer, Ian. Superpower: Three Choices for America’s Role in the World (Kindle Locations 475-481). Penguin Publishing Group. Kindle Edition.

Sixth, social welfare programs don’t solve poverty.

Matthew Spalding, September 21, 2012, CNN, Why the US has a culture of dependency, https://www.cnn.com/2012/09/21/opinion/spalding-welfare-state-dependency/
For most of American history, the average farmer, shop owner or entrepreneur could live an entire life without getting anything from the federal government except mail service. But those days have gone the way of the Pony Express. Last year, the Wall Street Journal reported that 49% of the population lives in a household where at least one person gets some type of government benefit. The Heritage Foundation’s annual Index of Dependence on Government tracks government spending and creates a weighted score adjusted for inflation of federal programs that contribute to dependency. It reports that in 2010, 67.3 million Americans received either Temporary Assistance for Needy Families, Social Security, support for higher education or other assistance once considered to be the responsibility of individuals, families, neighborhoods, churches, and other civil society institutions — an 8% increase from the year before. These people aren’t necessarily dependent on government; many could live (even live well) without their Social Security check, Pell grant or crop subsidy. That’s not the point. The problem is that Washington is building a culture of dependency, with ever-more people relying on an ever-growing federal government to give them cash or benefits. This is a growing and dangerous trend. The United States thrives because of a culture of opportunity that encourages work and disdains relying on handouts. The growth of the welfare state, a confusing alphabet soup of programs that are supposed to help low-income Americans make ends meet and do not include entitlements such as Social Security or Medicare, is turning us into a land where many expect, and see no stigma attached to, drawing regular financial support from the federal government. Opinion: Americans are not moochers Consider means-tested social welfare programs. The federal government operates at least 69 programs that provide assistance deliberately and exclusively to poor and lower-income people. 
The benefits include cash, food, housing, medical care and social services. Yet when poverty expert Robert Rector, senior research fellow at the Heritage Foundation, examined these anti-poverty programs, he found that only two, the earned income tax credit and the additional child refundable credit, require recipients to actually work for their benefits. It had been three, but earlier this year, the Obama administration effectively set aside the most well-known welfare work requirements, those specifically written into the 1996 Temporary Assistance to Needy Families law. The Department of Health and Human Services announced that states could apply for a waiver of the law’s clearly stated work requirements. Meanwhile, although spending on welfare has been cut in half since it was reformed in 1996, other federal spending on programs, such as food stamps, has soared year after year and decade after decade. Simply put, spending on social welfare programs has exploded. CNNMoney: The poor do have jobs Under a culture of dependency, poverty becomes a trap, and recipients get stuck. Long-term welfare recipients lose work habits and job skills and miss out on the marketplace contacts that lead to job opportunities. That’s a key reason the government should require welfare recipients to work as much as they can. What could be called “workfare” thus tends to increase long-term earnings among potential recipients.
And teams can even argue that social spending creates intergenerational poverty.

Family Facts, no date, Breaking the Cycle of Welfare Dependence, https://www.familyfacts.org/briefs/46/breaking-the-cycle-of-welfare-dependence

Research on Negative Effects of Welfare Dependence: Social science research published in peer-reviewed academic journals suggests that welfare participation is associated with negative effects for children and adults and with an intergenerational cycle of dependence.

  • Welfare participation and early childhood cognitive development. A 2011 study published in Children and Youth Services Review analyzed the effect of welfare participation in the TANF program on young children’s cognitive development.4 It found that, compared to children in non-welfare families, those whose families received welfare when they were between three and five years old had, on average, lower cognitive development (about 11 percent of a standard deviation difference on the Peabody Picture Vocabulary Test). The study considered a host of factors, including child and maternal background characteristics as well as family dynamics and the home environment, but the negative effect of participation in TANF persisted. Further analysis suggested that increased maternal stress and lower household income among TANF families explained about 7 percent and 19 percent, respectively, of the cognitive deficiency between five-year-olds whose families received welfare and those whose families did not.
  • Welfare receipt and children’s educational attainment. Welfare receipt, particularly during adolescence, also appeared to have a negative effect on children’s educational attainment. Using nationally representative data that tracked individuals born between 1967 and 1978 throughout their childhood, a 2003 Demography study found that one year of welfare receipt during ages 11 to 15 was associated with a reduction of more than a quarter of a year in total schooling.5 Moreover, the likelihood of high school completion diminished by 14 percent for each year of welfare receipt between age 11 and age 15 and by 6 percent for each year of welfare receipt between age 6 and age 10.
  • Intergenerational welfare receipt. Research suggests that welfare participation may adversely affect the next generation’s economic well-being. A 2003 study in the Journal of Marriage and Family found that, compared to women whose families, when they were between age 8 and age 13, did not receive welfare, those whose families participated in welfare were more likely to receive welfare themselves.6 The effect of intergenerational welfare receipt may be partially explained by the adult children recipients’ employment, education, and marital characteristics. That is, parental welfare receipt was linked to children’s employment, education, and marital status in adulthood. For example, compared to mothers who gave birth out of wedlock, married mothers who remained married averaged three fewer years of welfare participation.
  • Intergenerational welfare dependence. Using a nationally representative survey that followed the same group of respondents from childhood through adulthood, a 2004 study on the intergenerational transmission of welfare dependence also examined the likelihood of receiving welfare in adulthood if one’s parents ever received welfare.7 Welfare receipt included participation in AFDC (the pre–1996 reform cash assistance program); food stamps; or Supplemental Security Income (SSI). The study found that the average probability of welfare participation for the entire study sample was over a quarter (0.275); if their parents had ever received welfare while the respondents were growing up, their probability of welfare receipt as adults increased to nearly one-half (0.468)—nearly three times the probability (0.166) for respondents whose parents did not receive welfare.

Seventh, regardless of their impact on poverty, political and economic pressures have given states an incentive to provide welfare to as few people as possible: they use intrusive, harsh verification procedures.

Amy Mulzer, JD Columbia University School of Law, 2005, Columbia Human Rights Law Review, Summer, 36 Colum. Human Rights L. Rev. 663, “The Doorkeeper and the Grand Inquisitor: The Central Role of Verification Procedures in Means-Tested Welfare Programs,” p. 674-7
Improvements in the Medicaid and food stamp programs have been far overshadowed by negative developments in cash assistance programs. The legal-bureaucratic era ended with the passage of PRWORA in 1996, which eliminated the federal legal entitlement to cash assistance and devolved control over cash assistance (TANF) programs to the states. n52 While PRWORA requires states to “set forth objective criteria for the delivery of benefits and the determination of eligibility,” it gives them extraordinary discretion in the establishment of those criteria. n53 And while the force of tradition has led states to retain many of the procedures previously used in AFDC, various political and economic pressures have led to an increased use of these procedures as a method of “informally rationing” benefits. n54 Unable to alter eligibility levels without jeopardizing their caseload reduction credit n55 – and eager to reduce their caseloads – states are employing a different brand of verification extremism, aimed not at rooting out fraud but at discouraging claimants from applying in the first place. n56 This practice has been fueled by a new method of welfare administration, which relies less on formal policymaking and more on signaling and intimation among policymakers, administrators, and front-line workers. n57 It has also been fueled by public ambivalence towards cash  [675]  assistance programs; suspicious of these programs and their claimants, the public wants the rolls reduced, but does indeed wish to aid the poor. n58 Informal rationing allows states to reduce welfare rolls without cutting eligibility or benefit levels, leading the public to believe that the drop was caused by a genuine reduction in need.  In addition, with a degree of discretion and localization unknown since the pre-AFDC era has come an increase in the amount of outright hostility agencies display towards claimants. 
Some state and local agencies are using verification procedures to stigmatize and embarrass claimants, not merely to reduce the number of completed applications, but seemingly for the sake of stigma itself. n60 This approach, though not officially sanctioned by PRWORA, may be seen as following logically from signals the Act gave concerning the moral status of claimants. n61 A prime example of this approach may be found in New York City’s Human Resources Administration (HRA), as run by former mayor Rudolph Giuliani. Under Giuliani’s guidance, the agency took a highly intrusive investigative approach to verification, n62 applied documentation requirements strictly, n63 and encouraged suspicion of claimants. n64 Bucking the trend towards reliance on computer-matching in the program, HRA even mandated home visits for Medicaid applicants. In 1998, the New York Times reported the story of a woman whose application on behalf of her son was rejected after HRA investigators looked into her bedroom closet and spotted a pair of men’s blue jeans; the agency claimed that the jeans proved that the woman was still living with her spouse. n65 As journalist Nina Bernstein observes, home visits such as this one may be seen less as an attempt to weed out fraud and more as an attempt to bring back the age of man in the house rules and “midnight raids.” n66

Independently, this informal rationing subverts democratic accountability by allowing politicians to roll back welfare without public knowledge.
David A. Super, Law Professor, Washington & Lee University, 2004, Yale Law Journal, January, 113 Yale L.J. 815, “Offering an Invisible Hand: The Rise of the Personal Choice Model for Rationing Public Benefits,” p. 840-1

The relative invisibility of informal rationing devices has several consequences. Policymakers wishing to articulate one agenda and pursue another can adjust the stringency of informal rationing devices with little danger of being called on the inconsistency. At the same time it was publicly espousing fiscal discipline, the Clinton Administration made numerous changes in Medicaid and food stamp procedures to reduce claimants’ costs of participation and the risk of procedural denials. Its Office of Management and Budget (OMB) adopted the convenient position that changes allowing more already-eligible people to participate should not be regarded as increasing the programs’ costs because they were only bringing in participants whom Congress already had decided to serve when it enacted the programs’ substantive eligibility rules. n75 Conversely, as the recent economic downturn has squeezed states’ budgets, many have dropped policies adopted a few years earlier to ease procedural burdens on claimants for Medicaid. State policymakers apparently have reasoned that these changes will go largely unnoticed, or can be explained away in technical terms, while changing formal eligibility rules would be understood as a retreat from efforts to reduce the ranks of the uninsured. Yet the source of savings under both sets of policies is essentially the same: fewer people receiving Medicaid coverage. The relative invisibility of indirect methods can also allow policymakers to ration benefits for a broader array of purposes than they could readily hope to justify publicly. The upheavals of the mid-1990s did stretch the range of politically acceptable objectives for eligibility rules, at least for a while. Traditionally, however, policymakers have had to justify most eligibility rules under one of only a small handful of rubrics. Most substantive eligibility rules are explained either as measuring need for a benefit or worthiness to receive it. 
Once a basic need-or worthiness-based rationing system is established, arguments about equity, reliability, simplicity, or cost may lead to some fine-tuning. At some point, however, restrictions on substantive eligibility without substantive justification can expose policymakers’ failure to fulfill their own stated programmatic objectives. Discouraging participation may be a safer way to achieve the same savings. 

The failure to submit to democratic accountability threatens global survival.

Carl Boggs, National University, 1997, Theory and Society, December, Volume 26, Number 6, p. 773-4
The decline of the public sphere in late twentieth-century America poses a series of great dilemmas and challenges. Many ideological currents scrutinized here — localism, metaphysics, spontaneism, post-modernism, Deep Ecology — intersect with and reinforce each other. While these currents have deep origins in popular movements of the 1960s and 1970s, they remain very much alive in the 1990s. Despite their different outlooks and trajectories, they all share one thing in common: a depoliticized expression of struggles to combat and overcome alienation. The false sense of empowerment that comes with such mesmerizing impulses is accompanied by a loss of public engagement, an erosion of citizenship and a depleted capacity of individuals in large groups to work for social change. As this ideological quagmire worsens, urgent problems that are destroying the fabric of American society will go unsolved — perhaps even unrecognized — only to fester more ominously into the future. And such problems (ecological crisis, poverty, urban decay, spread of infectious diseases, technological displacement of workers) cannot be understood outside the larger social and global context of internationalized markets, finance, and communications. Paradoxically, the widespread retreat from politics, often inspired by localist sentiment, comes at a time when agendas that ignore or side-step these global realities will, more than ever, be reduced to impotence. In his commentary on the state of citizenship today, Wolin refers to the increasing sublimation and dilution of politics, as larger numbers of people turn away from public concerns toward private ones. By diluting the life of common involvements, we negate the very idea of politics as a source of public ideals and visions. In the meantime, the fate of the world hangs in the balance. 
The unyielding truth is that, even as the ethos of anti-politics becomes more compelling and even fashionable in the United States, it is the vagaries of political power that will continue to decide the fate of human societies. This last point demands further elaboration. The shrinkage of politics hardly means that corporate colonization will be less of a reality, that social hierarchies will somehow disappear, or that gigantic state and military structures will lose their hold over people’s lives. Far from it: the space abdicated by a broad citizenry, well-informed and ready to participate at many levels, can in fact be filled by authoritarian and reactionary elites — an already familiar dynamic in many lesser developed countries. The fragmentation and chaos of a Hobbesian world, not very far removed from the rampant individualism, social Darwinism, and civic violence that have been so much a part of the American landscape, could be the prelude to a powerful Leviathan designed to impose order in the face of disunity and atomized retreat. In this way the eclipse of politics might set the stage for a reassertion of politics in more virulent guise — or it might help further rationalize the existing power structure. In either case, the state would likely become what Hobbes anticipated: the embodiment of those universal, collective interests that had vanished from civil society.