The Use of Longitudinal Data on Social Security Program Knowledge

Social Security Bulletin, Vol. 79 No. 4, 2019

The Social Security Administration (SSA) supplements a National Institute on Aging grant that funds a longitudinal Internet panel study to measure public knowledge about the Social Security programs. This article briefly reviews SSA's past efforts to gauge public knowledge of the programs, describes the Understanding America Study (UAS) panel used in the current effort, and presents results of wave 1 and wave 2 of the UAS surveys that focus on Social Security knowledge with detail by respondent age, education, and financial literacy level. Our findings indicate that younger workers with lower levels of education and financial literacy are logical targets for agency informational outreach and interventions.

Laith Alattar, Matt Messel, and David Rogofsky are with the Office of Research, Evaluation, and Statistics (ORES), Office of Retirement and Disability Policy (ORDP), Social Security Administration (SSA). Mark Sarney is the acting director of the Office of Research, ORES, ORDP, SSA.

Acknowledgments: The authors thank Anya Olsen, Richard Chard, and Kristi Scott for their helpful comments and suggestions.

The findings and conclusions presented in the Bulletin are those of the authors and do not necessarily represent the views of the Social Security Administration.


Selected Abbreviations
RCT randomized controlled trial
SSA Social Security Administration
UAS Understanding America Study

Many federal programs and services promote the health, safety, and economic security of the American public. To make full use of those programs and services, individuals must be knowledgeable about them. Since the 1990s, the Social Security Administration (SSA) has regularly evaluated public knowledge of its retirement and disability programs. Most recently, the agency has funded a longitudinal study of program knowledge using the Understanding America Study (UAS). This longitudinal research may enable SSA to expand its understanding of the public's program knowledge in a number of ways.

In this article, we present results from the first two waves of the UAS survey on Social Security program knowledge. This research sheds light on how the level of program knowledge varies across the life cycle. We begin by documenting historical SSA efforts to gauge the public's program knowledge. We then provide an overview of the UAS, highlighting the opportunities offered by a longitudinal study of program knowledge. After presenting initial UAS results, we discuss SSA's possible next steps in using the longitudinal study to measure public knowledge and tailor effective communication efforts.

Literature Review

Along with pensions and private savings, Social Security forms the metaphorical three-legged stool of retirement security (DeWitt 1996). For many, Social Security is the primary source of retirement income (SSA 2016). Understanding whether one is eligible for Social Security benefits, when to claim those benefits, and how much income to expect from them affects work and savings decisions before retirement (Gustman and Steinmeier 1999; Rohwedder and van Soest 2006)—and those decisions in turn affect the level of income in retirement. Social Security program knowledge thus plays an important role in retirement security. For decades, SSA has worked to better inform the public about its retirement and disability programs.

Past Efforts

In 1995, SSA undertook the largest effort in its then-60-year history to inform the public about its retirement program by introducing the Social Security Statement. The annual Statement provides projected estimates of the monthly benefits that a worker will receive based on his or her earnings history and age at claiming, along with a summary explanation of the benefits.1 SSA then commissioned the Gallup Organization to conduct a series of cross-sectional surveys between 1998 and 2001 to gauge public knowledge of the Social Security programs. These surveys greatly expanded the agency's understanding of the public's program knowledge. For instance, Smith and Couch (2014) analyzed the Gallup surveys and found that many younger workers understood the basics of the retirement and disability programs but did not understand certain aspects such as how benefits are calculated.

The Gallup surveys also shed light on the effectiveness of the Social Security Statement in increasing program knowledge. Although the surveys were cross-sectional, different iterations took place before and after the Statement was introduced, providing researchers with a natural experiment to test changes in populationwide knowledge.2 Cook, Jacobs, and Kim (2010) found evidence that the Statement increased program knowledge. Smith and Couch (2014) found that the Statement particularly improved knowledge among the population with low levels of education. Other researchers tested the effectiveness of the Statement using the Health and Retirement Study (HRS), which collects longitudinal data but provides limited measurement of program knowledge.3 As in the Gallup-based studies, Mastrobuoni (2011) found that the Statement increased program knowledge. Conversely, Armour and Lovenheim (2016) found that some Statement recipients misunderstood the presentation of projected benefits. Biggs (2010) reported more ambiguous findings and suggested further research is needed to understand both the public's knowledge about Social Security and the effectiveness of the Statement in shaping that knowledge.

More recently, SSA further explored program knowledge using Internet panel studies. Funded by an SSA Retirement Research Consortium grant, Liebman and Luttmer (2015) conducted a randomized controlled trial (RCT) using a large Internet panel called Knowledge Networks (now known as the GfK KnowledgePanel) to see how an informational intervention might affect retirement behavior. They found that sending a brochure to persons aged 55–70 with information about the retirement earnings test (which applies to those who work after claiming benefits) increased employment by 4.2 percentage points. SSA also developed a program-knowledge survey as part of the American Life Panel (Greenwald and others 2010), which yielded substantial information about program knowledge.4


In its latest effort, SSA is funding a program-knowledge survey as a component of the UAS, an Internet-based panel managed by the University of Southern California. The UAS panel is a representative sample of approximately 8,000 U.S. households.5 Researchers use an address-based sample to recruit panel members. Tablet computers and Internet access are provided to participants who need them. Panel members may choose to participate in a number of surveys covering a wide range of topics, for which they receive nominal compensation. Researchers administer the Social Security program-knowledge survey on a rolling basis every 2 years. The protocol is to administer the survey to all new panel members or to any panel member who has not taken that survey for 2 years. Researchers use the Census Bureau's Current Population Survey Annual Social and Economic Supplement as the benchmark for weighting. The reference population for the UAS pool of respondents is the U.S. population aged 18 or older excluding military personnel and institutionalized individuals.6

The UAS offers a number of advantages for researching public knowledge of federal programs such as Social Security.

In this study, we use the first two waves of the Social Security program-knowledge survey to extend previous research on how the public understands the programs. Wave 1 is designated by UAS as survey 16 (UAS 16) and wave 2 is designated as UAS 94. We address the following questions:

  1. How knowledgeable is the population about basic aspects of Social Security?
  2. Does populationwide knowledge change over time?
  3. How does an individual's knowledge vary across the life course?
  4. Within age groups, how does knowledge vary by individual characteristics such as education and financial literacy?


More than 5,000 UAS panel members completed the UAS 16 Social Security program-knowledge survey, providing an overall response rate of 85.4 percent of the total UAS panel. We restrict our sample to individuals aged 25–65 who completed both UAS 16 and UAS 94. At the time of analysis, we had access to complete second-wave data for one UAS sampling batch of 1,279 panel members.8 Of the 929 participants who completed the first wave in 2015, 724 also completed the second wave in 2017 (a 77.9 percent follow-up response rate).9 If the characteristics of panel members who completed both waves differ in meaningful ways from those of members who completed only the first wave, the measures of program knowledge may be biased. Table 1 compares the demographic characteristics of panel members who completed only the first survey wave with those who completed both waves. The demographic variables used in this analysis derive from UAS 1, which focused on cognitive abilities, financial literacy, and psychology. Only the age distribution of the groups differs significantly, in that younger panel members (aged 25–35) are less likely to have completed both waves.10

Table 1. Weighted characteristics of UAS respondents (in percent) by survey-wave participation: 2015 and 2017
Characteristic Wave 1 only Both waves 1 and 2 Percentage-point difference
Number 205 724 . . .
Percent 22.1 77.9 . . .
Sex
Men 47.2 50.2 -3.0
Women 52.8 49.8 3.0
Age
25–35 29.3 18.9 10.4*
36–54 43.4 44.2 -0.8
55–65 27.3 36.9 -9.6
Educational attainment
Less than high school diploma 10.4 8.1 2.3
High school diploma 34.5 29.9 4.6
Some college 23.7 24.2 -0.5
Bachelor's degree or higher 31.4 37.8 -6.4
Race/ethnicity
White (non-Hispanic) 58.3 64.5 -6.2
Black (non-Hispanic) 14.9 15.9 -1.0
Other non-Hispanic 5.2 2.1 3.1
Hispanic or Latino 21.6 17.6 4.0
Marital status
Married 54.9 63.1 -8.2
Other 45.1 36.9 8.2
Employment status
Working 82.9 83.3 -0.4
Other 17.1 16.7 0.4
Mean annual income ($) 48,719 55,810 -7,091
SOURCE: UAS 16 and UAS 94; data on marital status and employment status are from UAS 1.
NOTES: Rounded components of percentage distributions do not necessarily sum to 100.0.
. . . = not applicable.
* = statistically significant at the 0.05 level.

The program-knowledge survey covers respondents' understanding of Social Security program basics and of benefit-claiming age (and its effect on benefit amounts) in particular. In this study, we focus on knowledge of program basics in nine different subject areas. Box 1 shows the Social Security program aspects covered in the survey, the wording of the associated questions, the response options, and the correct responses.

Box 1. Social Security program aspects and the specific survey questions that measure respondents' knowledge of them
Aspect Question and answers
Age adjustment The amount of Social Security retirement benefits is not affected by the age at which someone starts claiming.
☐ True ☑ False
Benefit calculation Which of the following best describes how a worker's Social Security benefits are calculated?
☐ They are based on how long you work as well as your pay during the last five years that you are employed;
☑ They are based on the average of the highest 35 years of your earnings;
☐ They are based on how much Social Security taxes you paid;
☐ They are based on your income tax bracket when you claim benefits
Child survivor benefits If a worker who pays Social Security taxes dies, any of his/her children under age 18 may claim Social Security survivor benefits.
☑ True ☐ False
Claiming upon retirement Social Security benefits have to be claimed as soon as someone retires.
☐ True ☑ False
Disability benefits Workers who pay Social Security taxes are entitled to Social Security disability benefits if they become disabled and are no longer able to work.
☑ True ☐ False
Inflation adjustment Social Security benefits are adjusted for inflation.
☑ True ☐ False
Payroll tax Social Security is paid for by a tax placed on both workers and employers.
☑ True ☐ False
Spousal benefits Someone who has never worked for pay may still be able to claim benefits if his or her spouse qualifies for Social Security.
☑ True ☐ False
Widow(er) benefits If a worker who pays Social Security taxes dies, his/her spouse may claim Social Security survivor benefits only if they have children.
☐ True ☑ False
SOURCE: UAS 16 and UAS 94 questionnaires.
NOTES: Some of the questionnaire's wording has been slightly modified for contextual clarity.
Correct answers indicated by ☑.

We measure program knowledge among three age groups that correspond with the age ranges for which SSA provides different versions of the Social Security Statement: 25–35 (young workers), 36–54 (midcareer workers), and 55–65 (workers near retirement age). Within these age groups, we also investigate variation across two broad educational attainment categories (high school diploma or less and some college or more)11 and two levels (high and low) of financial literacy as determined by a 14-item UAS assessment derived from questions developed by Lusardi and Mitchell (2017). Each of these variables derives from UAS 1. The financial literacy assessment tests respondents' knowledge of annuities, individual retirement accounts, and life insurance policies, among other topics. A score at or above the sample median indicates high financial literacy. We use descriptive statistics to present our findings.
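The high/low financial literacy classification described above (a score at or above the sample median counts as high) can be sketched in a few lines. This is an illustration only; the function name and example scores are ours, not drawn from the UAS codebook.

```python
def median_split(scores):
    """Label each respondent's financial literacy score 'high' if it is
    at or above the sample median, 'low' otherwise (the article's rule)."""
    ordered = sorted(scores)
    n = len(ordered)
    # Midpoint of the two middle values; equals the middle value when n is odd.
    median = (ordered[(n - 1) // 2] + ordered[n // 2]) / 2
    return ["high" if s >= median else "low" for s in scores]
```

For example, scores of 3, 7, 10, 12, and 14 on the 14-item assessment have a median of 10 and would be labeled low, low, high, high, and high.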


Table 2 shows relatively high levels of knowledge for many basic aspects of Social Security. For instance, more than 80 percent of respondents know of the availability of Social Security disability benefits, the adjustment of benefit amounts by claiming age, the option to wait after retirement to claim benefits, the funding of Social Security through payroll taxes, and the availability of benefits for the minor children of beneficiaries. Americans are less knowledgeable about some other aspects of Social Security, however. Relatively few individuals understand that Social Security benefits adjust with inflation and that spousal benefits may be available, including to a widow(er) with no children. Only one in five wave 1 respondents, when given a choice of answers, correctly identified how Social Security benefits are calculated.

Table 2. Levels of knowledge of selected Social Security program aspects in wave 1 (2015) and wave 2 (2017)
Aspect a Wave 1 (percentage correct) Wave 2 (percentage correct) Percentage-point change from wave 1 to wave 2
Disability benefits 88.1 93.5 5.4*
Age adjustment 84.2 91.7 7.5*
Claiming upon retirement 82.0 85.3 3.3
Payroll tax 81.4 85.7 4.3
Child survivor benefits 80.9 85.4 4.5
Spousal benefits 78.7 75.2 -3.5
Inflation adjustment 63.9 65.2 1.3
Widow(er) benefits 63.2 66.3 3.1
Benefit calculation 20.6 34.0 13.4*
Total 71.4 75.8 4.4*
SOURCE: Authors' calculations using UAS 16 and UAS 94 results.
NOTE: * = statistically significant at the 0.05 level.
a. See Box 1 for the survey question that measures knowledge of a given aspect.

For most Social Security program aspects, knowledge did not change significantly between survey waves. There were some notable exceptions, however. Knowledge of how benefits are calculated increased by 13.4 percentage points, knowledge of the age adjustment increased by 7.5 percentage points, and knowledge of the presence of disability benefits increased by 5.4 percentage points.
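The article does not describe the test behind these significance markers. Because the same panel members answer in both waves, one standard choice for assessing a change in a paired binary outcome is McNemar's test; the sketch below, using a normal approximation and illustrative data, is our assumption about how such a comparison could be run, not the authors' actual procedure.

```python
import math

def pct_point_change(wave1, wave2):
    """Percentage-point change in the share answering correctly
    (responses coded 1 = correct, 0 = incorrect)."""
    return 100 * (sum(wave2) / len(wave2) - sum(wave1) / len(wave1))

def mcnemar_p(wave1, wave2):
    """Two-sided p-value for a change in a paired binary outcome,
    via the normal approximation to McNemar's test."""
    b = sum(1 for w1, w2 in zip(wave1, wave2) if w1 and not w2)  # correct -> incorrect
    c = sum(1 for w1, w2 in zip(wave1, wave2) if not w1 and w2)  # incorrect -> correct
    if b + c == 0:
        return 1.0  # no discordant pairs: no evidence of change
    z = (b - c) / math.sqrt(b + c)
    return math.erfc(abs(z) / math.sqrt(2))
```

With 1,000 hypothetical paired responses in which the share correct rises from 20.6 percent to 34.0 percent (all movement from incorrect to correct), the change is 13.4 percentage points and the p-value falls well below 0.05.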

On average, respondents correctly answered 71.4 percent of all questions in wave 1 and 75.8 percent of all questions in wave 2. Chart 1 shows that individuals aged 25–35 had the lowest levels of knowledge in both waves 1 and 2, but exhibited a significant increase in knowledge between survey waves. The middle age group (36–54) also experienced a significant—although smaller—increase. Knowledge also increased, but not significantly so, among the group approaching retirement (ages 55–65).

Chart 1.
Social Security program knowledge, by age group, educational attainment, and financial literacy level: Average percentage of correct answers in wave 1 (2015) and percentage-point increase in wave 2 (2017)
Stacked bar chart linked to data in table format, which is provided in Table 3.
SOURCE: Authors' calculations using UAS 16 and UAS 94 results.
NOTE: * = statistically significant at the 0.05 level.

Within age groups, knowledge levels varied by education and financial literacy. Chart 1 shows that individuals with higher educational attainment had higher levels of Social Security knowledge, and Table 3 shows that the differences by educational attainment are significant for all age groups and survey waves except for individuals approaching retirement (ages 55–65) in wave 2. Increases in knowledge between survey waves, however, were similar regardless of education. Among the youngest group, knowledge increased substantially both for individuals who attended college and for those who did not; yet the knowledge gap by educational attainment persisted in wave 2. Only among the group approaching retirement (ages 55–65) did individuals with no college narrow the gap with their college-educated peers by more than 1 percentage point (although the remaining difference is still not significant).

Table 3. Changes within age groups in Social Security program knowledge, by survey wave, educational attainment, and financial literacy level (average percentage of correct responses overall): 2015 and 2017
Age group and wave Total High school or less Some college or more Percentage-point difference Low financial literacy High financial literacy Percentage-point difference
Ages 25–35
Wave 1 61.4 49.0 68.1 19.1* 57.2 68.0 10.8
Wave 2 69.3 57.6 75.9 18.3* 64.1 78.6 14.5*
Percentage-point change 7.9* 8.6* 7.8* . . . 6.9* 10.6* . . .
Ages 36–54
Wave 1 70.2 65.6 73.6 8.0* 66.1 76.2 10.1*
Wave 2 74.6 69.3 78.2 8.9* 70.2 80.3 10.1*
Percentage-point change 4.4* 3.7* 4.6* . . . 4.1* 4.1* . . .
Ages 55–65
Wave 1 78.1 73.4 81.3 7.9* 71.7 81.7 10.0*
Wave 2 80.8 77.5 83.0 5.5 73.9 84.7 10.5*
Percentage-point change 2.7 4.1 1.7 . . . 2.2 3.0 . . .
SOURCE: Authors' calculations using UAS 16 and UAS 94 results.
NOTES: . . . = not applicable.
* = statistically significant at the 0.05 level.

Knowledge patterns by financial literacy largely mirrored those by education. With the exception of younger individuals (ages 25–35) in wave 1, differences by financial literacy were significant in all age groups. Increases in knowledge did not vary substantially by an individual's level of financial literacy. In no age group did those with less financial literacy close the knowledge gap between waves. For the youngest group (ages 25–35), the difference in knowledge was not significant in wave 1, but became significant by wave 2.

One possible explanation for the relatively strong growth in program knowledge among young adults is that they are first encountering basic aspects of Social Security (such as the availability of disability benefits or the funding of the program through a payroll tax) that are more widely known by older individuals. Individuals may tend to learn less widely understood program aspects (such as spousal benefits) later. Evidence for this theory is limited, however, as Table 4 suggests. The youngest age group (25–35) shows the smallest increase in knowledge of the least understood concept (the benefit calculation). On nearly every other program aspect, however, the younger individuals exhibit an increase in knowledge similar to or larger than that of the older age groups. In all cases, differences between age groups for individual survey items were statistically significant.

Table 4. Change in levels of knowledge of selected Social Security program aspects between wave 1 (2015) and wave 2 (2017), by age group (in percentage points)
Aspect a Overall 25–35 36–54 55–65
Disability benefits 5.4* 10.8 3.9 4.6
Age adjustment 7.5* 6.1 8.5 6.8
Claiming upon retirement 3.3 12.4 2.7 -0.7
Payroll tax 4.3 6.7 4.5 3.5
Child survivor benefits 4.5 12.5 5.2 -0.8
Spousal benefits -3.5 -2.4 -3.4 -5.2
Inflation adjustment 1.3 4.7 -2.7 3.9
Widow(er) benefits 3.1 8.8 6.2 -3.5
Benefit calculation 13.4* 10.3 13.1 15.1
SOURCE: Authors' calculations using UAS 16 and UAS 94 results.
NOTES: All differences between age groups are statistically significant at the 0.05 level.
* = statistically significant at the 0.05 level.
a. See Box 1 for the survey question that measures knowledge of a given aspect.

Discussion, Limitations, and Future Research

For more than two decades, SSA has worked to gauge public knowledge of its retirement and disability programs. Investing in the UAS' longitudinal survey of program knowledge is the most recent of these efforts.

Our study is the first attempt to use longitudinal program-knowledge data to build on existing research and provide improved insights on what people know and how that knowledge changes over time. We measure Social Security program knowledge at different time points and examine differences by age, educational attainment, and level of financial literacy. We find that, for example, knowledge increases most among young individuals (ages 25–35). This finding echoes research by Smith and Couch (2014), who emphasize the importance of targeting informational outreach efforts to younger workers.

Although knowledge about Social Security increases among young individuals of varying levels of educational attainment and financial literacy, their knowledge still lags significantly relative to that of older individuals. Our findings suggest that young individuals with no postsecondary education or low levels of financial literacy are potential targets for informational interventions. For such individuals, who are more likely to rely predominantly on Social Security benefits for their future retirement income, these interventions could prove particularly important.

Our study faces a number of limitations. One challenge is that repeated testing itself may affect the validity of the findings. That is, knowledge may increase simply because panel members complete the survey multiple times, and not because of agency outreach or more organic means such as learning from employers, peers, or family members. However, because panel members do not receive the correct answers upon completing the survey and take the survey only every 2 years, any such learning effect is likely minimal.

Another limitation is that this study does not identify the means by which respondents learned program aspects. Future research could identify which factors and processes drive knowledge gains.

The availability and use of longitudinal program-knowledge data from the UAS will continue to expand. Chard, Rogofsky, and Yoong (2017) used UAS data to develop a sophisticated measure of Social Security program knowledge and retirement preparedness, which researchers may use in future studies to measure changes in knowledge and preparedness over time. In addition, SSA researchers are conducting RCTs to evaluate the effectiveness of alternative communications to improve program knowledge—especially on topics for which knowledge has consistently been low, such as the retirement earnings test and the effects of choosing a retirement claiming age. The use of RCTs in combination with longitudinal survey data on program knowledge can guide agency efforts to better inform the public about its programs.


Notes

1 The Statement emphasizes that the benefit projections assume the continuation of the terms of the Social Security Act as currently amended and that future legislation can change those terms.

2 Initial Statement mailings went only to recipients in targeted age groups. The agency phased in wider mailings over several years. Not until fiscal year 2000 did all adults aged 25 or older receive a Statement.

3 To measure program knowledge in the HRS, researchers have generally tested whether an individual's expected Social Security benefit matches projections based on their earnings history. A reasonable alignment of expected and projected benefit amounts is deemed to signal a high level of program knowledge.

4 Along with the Internet-based survey, the researchers conducted a parallel telephone survey.

5 At the time of analysis, the UAS panel was a representative sample of approximately 6,000 U.S. households.

6 For more information on the UAS, see Alattar, Messel, and Rogofsky (2018).

7 Fielding a survey costs $3.00 per respondent per survey minute for the first 500 respondents, $2.50 for the next 500 respondents, and $2.00 for all additional respondents. Postproject services, including data delivery and documentation, cost an additional $2,000. Thus, a 15-minute survey administered to 1,000 respondents would cost approximately $43,250. More information on the pricing of survey administration is available at
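The tiered rates in this note imply a simple cost formula. As an illustration (the function is ours, not an official SSA or UAS calculator), the $43,250 figure can be reproduced as follows:

```python
def survey_cost(respondents, minutes, postproject=2000.0):
    """Estimated fielding cost under the quoted tiered rates:
    $3.00 per respondent-minute for the first 500 respondents,
    $2.50 for the next 500, and $2.00 thereafter, plus a $2,000
    postproject fee for data delivery and documentation."""
    tier1 = min(respondents, 500)
    tier2 = min(max(respondents - 500, 0), 500)
    tier3 = max(respondents - 1000, 0)
    per_minute_rate = 3.00 * tier1 + 2.50 * tier2 + 2.00 * tier3
    return per_minute_rate * minutes + postproject
```

For a 15-minute survey with 1,000 respondents, this yields (500 × $3.00 + 500 × $2.50) × 15 + $2,000 = $43,250, matching the example in the note.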

8 The UAS panel consists of 21 sampling batches. This article uses data from the ASDE 2014-01 Nationally Representative sample, the initial sampling batch. The program-knowledge surveys for subsequent batches in wave 2 either remain in the field or are yet to be administered. More information about each sampling batch is in the “Methodology: Response and Attrition” section of the UAS website (

9 Because wave 2 is still in the field, the final response rate should exceed 77.9 percent.

10 We also find no significant differences by age in conjunction with other characteristics except that non-Hispanic black respondents aged 55–65 were more likely than members of other race/ethnicity groups in that age range to complete both survey waves (not shown). Because changes in program knowledge between waves did not vary by race/ethnicity, however, that difference should not bias the results.

11 We used only these two broad categories because using additional categories would have raised sample-size concerns.


References

Alattar, Laith, Matt Messel, and David Rogofsky. 2018. “An Introduction to the Understanding America Study Internet Panel.” Social Security Bulletin 78(2): 13–28.

Armour, Philip, and Michael Lovenheim. 2016. “The Effect of Social Security Information on the Labor Supply and Savings of Older Americans.” Michigan Retirement Research Center Working Paper No. 2016-361. Ann Arbor, MI: MRRC.

Biggs, Andrew G. 2010. “Improving the Social Security Statement.” Financial Literacy Center Working Paper No. WR-794-SSA. Santa Monica, CA: RAND Corporation.

Chard, Richard, David Rogofsky, and Joanne Yoong. 2017. “Wealthy or Wise: How Knowledge Influences Retirement Savings Behavior.” Journal of Behavioral and Social Sciences (4): 164–180.

Cook, Fay Lomax, Lawrence R. Jacobs, and Dukhong Kim. 2010. “Trusting What You Know: Information, Knowledge, and Confidence in Social Security.” The Journal of Politics 72(2): 397–412.

DeWitt, Larry. 1996. “Origins of the Three-Legged Stool Metaphor for Social Security.” Research Notes and Special Studies from the Historian's Office, Note No. 1. Woodlawn, MD: SSA.

Greenwald, Mathew, Arie Kapteyn, Olivia S. Mitchell, and Lisa Schneider. 2010. “What Do People Know About Social Security?” Working Paper No. WR-792-SSA. Santa Monica, CA: RAND Corporation.

Gustman, Alan L., and Thomas L. Steinmeier. 1999. “What People Don't Know About Their Pensions and Social Security: An Analysis Using Linked Data from the Health and Retirement Study.” NBER Working Paper No. 7368. Cambridge, MA: National Bureau of Economic Research.

Liebman, Jeffrey B., and Erzo F. P. Luttmer. 2015. “Would People Behave Differently If They Better Understood Social Security? Evidence from a Field Experiment.” American Economic Journal: Economic Policy 7(1): 275–299.

Lusardi, Annamaria, and Olivia S. Mitchell. 2017. “How Ordinary Consumers Make Complex Economic Decisions: Financial Literacy and Retirement Readiness.” Quarterly Journal of Finance 7(3).

Mastrobuoni, Giovanni. 2011. “The Role of Information for Retirement Behavior: Evidence Based on the Stepwise Introduction of the Social Security Statement.” Journal of Public Economics 95(7): 913–925.

Rohwedder, Susan, and Arthur van Soest. 2006. “The Impact of Misperceptions About Social Security on Saving and Well-Being.” Michigan Retirement Research Center Working Paper No. 2006-118. Ann Arbor, MI: MRRC.

Smith, Barbara A., and Kenneth A. Couch. 2014. “How Effective Is the Social Security Statement? Informing Younger Workers About Social Security.” Social Security Bulletin 74(4): 1–19.

[SSA] Social Security Administration. 2016. Income of the Population 55 or Older, 2014. Publication No. 13-11871. Washington, DC: SSA.