Social Security Administration, Office of Policy: 2001 Customer Satisfaction Survey, Final Report
December 21, 2001
Table of Contents
- Executive Summary
- Customer Satisfaction
- Respondent Interest Areas
- Respondent Recommendations
- Respondent Types
- Survey Results
- The Sample and Respondents
- Customer Satisfaction
- Satisfaction with Statistical Tables and Analytical Articles or Reports
- Satisfaction with Specific SSA Publications
- Satisfaction with Overall Quality
- Satisfaction with Overall Quality by Sample Group
- Satisfaction with SSA's Performance on New Issues
- Satisfaction with SSA's Performance on New Issues by Sample Group
- Respondent Interests
- Broad Interest Areas
- Interest in Topics and Subgroups
- Respondent's Priority Research and Policy Issues
- Interest in Public-Use Data Files
- Use of Specific SSA Publications
- Respondents' Recommendations for Improvement
- Recommendations for Improvement
- Reasons for Not Using SSA's Information
- Respondent Types
- Work Affiliation
- Satisfaction by SSA Employment
- Years of Professional Interest
- Frequency of Use
- Source of Information
- Appendix A: Survey Methodology
- Survey Design
- Questionnaire Design and Pretests
- Sample Frame and Selection
- Data Collection
- Data File Preparation and Analysis
- Appendix B: Survey Questionnaire
Executive Summary
Over the past few years, the Social Security Administration (SSA) has made a concerted effort to strengthen its research, statistical, and policy analysis functions. Many of these functions have been consolidated in a reorganized and expanded Office of Policy. This report presents results from a survey of customer satisfaction among users of the Office of Policy's products and services. Specifically, the goals for the survey were to determine, among decisionmakers and others involved in Social Security and related issues, (1) the extent to which SSA's research, statistical, and policy analysis work was focusing on topics and issues of widespread concern, including new and emerging issues, and (2) the extent to which these research and analysis products were eliciting high levels of satisfaction from those who made use of them.
Customer Satisfaction
Eighty-six percent of respondents said they were somewhat or very satisfied with the overall quality of SSA's research, statistical, or policy information in the preceding 24 months. Thirty-seven percent were very satisfied. Current subscribers to SSA publications reported the highest levels of overall satisfaction compared to other sample groups.
In regard to specific attributes of both statistical tables and analytical articles, customers were most satisfied with accuracy (almost 90% very or somewhat satisfied), while ease of finding information was rated lowest (about 70% very or somewhat satisfied). Overall, statistical tables received slightly higher ratings than analytical articles. In terms of specific publications, respondents gave the highest ratings to the Social Security Bulletin (83% very or somewhat satisfied) and the lowest to Social Security Programs Throughout the World (64% very or somewhat satisfied).
Another attribute of customer satisfaction is SSA's performance in keeping up with new and emerging research and policy issues. Respondents did not rate this measure as highly as overall satisfaction: 62% said they were very or somewhat satisfied; 18% said they were very satisfied. Consistent with other findings, Subscribers gave higher ratings than other sample groups on SSA's performance in keeping up with new and emerging issues.
Respondent Interest Areas
Respondents indicated somewhat more interest in Social Security issues than in SSI and somewhat more interest in disability than in retirement issues. From a list of topics of interest, the three mentioned most by respondents were current programs or proposals for change, economic well-being of the aged or disabled, and work-related issues (e.g., disability and work). The three subgroups of interest mentioned most by respondents were disabled adults, women, and low-wage workers.
The top research or policy issues that respondents think SSA should be working on in the next year or two are Social Security reform and privatization, financing issues and solvency, and work incentives. In regard to administrative data, respondents are most interested in public-use data files that have SSA data linked to surveys. Publications most widely used by the respondents were the Social Security Bulletin, the Annual Statistical Supplement, and Fast Facts and Figures about Social Security.
Respondent Recommendations
Only about two-fifths of respondents offered recommendations for improving Social Security's research, statistical, or policy products and services. The most common recommendation was for additional data or analysis in a particular topic area. Next most frequent were recommendations for improving the dissemination of information, including SSA's website, and for increasing the clarity of information.
Respondent Types
Nearly one-fourth of respondents worked for the federal government and another one-fifth were employed in higher education. Almost half had been interested in SSA related issues for more than 20 years. Of the respondents who used SSA information, almost half said they had received or sought SSA information more than 10 times in the preceding 24 months. Of those users, about 90% got the information from published or hard copy materials and two-thirds from SSA web sites.
Over the past few years, the Social Security Administration (SSA) has made a concerted effort to strengthen its research, statistical, and policy analysis functions, recognizing that information generated through these activities is essential to the development of sound and effective policies and programs. Many of these research and policy functions have been consolidated in a reorganized and expanded Office of Policy (OP). This empirical and analytical information is provided to decisionmakers, both inside and outside of SSA, and it is also disseminated to members of the broader research and policy community who are concerned with Social Security and income maintenance issues. Among other venues, dissemination occurs through regular publications, postings on the Internet, briefings and other presentations, and responses to information requests.
In order to evaluate the perceived quality of SSA's research, statistical, and policy analysis work and to promote further improvements, SSA's FY 2000 Performance Plan included the development of a customer satisfaction survey to be conducted in FY 2001. The goals for the survey were as follows: to determine, among decisionmakers and others involved in Social Security and related issues, (1) the extent to which SSA's research, statistical, and policy analysis work was focusing on topics and issues of widespread concern, including new and emerging issues, and (2) the extent to which these research and analysis products were eliciting high levels of satisfaction from those who made use of them—specifically, in terms of such attributes as accuracy, comprehensiveness, and responsiveness. Information gained from the survey will be used to guide the agency's efforts for further improvements in its research, statistical, and policy analysis products and services.
Essentially, this report contains two main sections: one on customer satisfaction data and the other on respondent interests and recommendations. The first section outlines customer satisfaction results in several categories: satisfaction with SSA's statistical tables and analytical reports, with specific SSA publications, with the overall quality of information, and with SSA's performance in working on new and emerging research and policy issues. In addition, some analyses are reported by sample group.
The second section of the report addresses additional respondent issues. The section begins with a discussion of respondents' research and policy interests, including particular topics and subgroups, and priority issues for SSA to address in the next year or two. The presentation then describes recommendations made by respondents for improving SSA's research, statistical, and policy analysis products and services.
A brief final section analyzes types of respondents who were very satisfied with these products and services, and appendices provide details on the survey's methodology and a facsimile of the mailed questionnaire.
The Sample and Respondents
The sample for this survey came from four different sources. Eighteen hundred cases were sampled from across four unduplicated lists. The first sample group (Decisionmakers) included 59 individuals in high-level positions in SSA and related Federal agencies. The second sample group (Subscribers) consisted of 885 individual subscribers to a cross section of SSA-OP publications. The third sample group (Non-Subscribers) included 512 cases from a series of lists including attendees at various research and policy-related conferences, as well as members of the National Academy of Social Insurance. The fourth sample group, from a database of stakeholders maintained by SSA's Office of Communication (OComm), included 344 individuals who had indicated an interest in receiving SSA's statistical publications but were not current subscribers.
The table below shows the total number in the sample and the corresponding number of respondents from each of the four sample sources who completed the survey and who identified themselves as actively interested in research, statistics, or policy issues relevant to SSA or SSI. There were also three active completes whose sample group could not be identified. Further information about sampling and response rates can be found in Appendix A: Survey Methodology.
[Table: number in the sample and number of active completes, by sample source]
In addition to the 913 active completed questionnaires, 130 respondents indicated that they were inactive in this research or policy area and were therefore ineligible for most of the survey's questions. The final response rate for this survey, including active and inactive completes, was 58%.
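The response-rate arithmetic implied by these counts can be checked directly. The sketch below uses only figures reported in this section:

```python
# Response-rate check using the counts reported in this section.
active_completes = 913
inactive_completes = 130
total_sampled = 1800  # 59 + 885 + 512 + 344 across the four lists

response_rate = (active_completes + inactive_completes) / total_sampled
print(f"{response_rate:.1%}")  # prints 57.9%, reported as 58%
```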
Included in the 913 active completes are 73 respondents who said they were not users of SSA information in the two years preceding the survey. These non-users were included in questions about interests and recommendations, but were not asked the questions about customer satisfaction. The table below shows that the non-users were distributed fairly evenly across three of the sample sources.
[Table: number of non-users among the active completes, by sample source]
Customer Satisfaction
Satisfaction With Statistical Tables And Analytical Articles Or Reports. Over three-quarters of respondents reported that they received information from SSA in the form of either statistical tables or analytical articles and reports.
In addition to these two forms specified in the questionnaire, some respondents said they received the information in another format. Among those respondents, the formats most commonly mentioned were verbal reports, presentations, and discussions.
The respondents who answered that they received information in the form of statistical tables or analytical articles and reports were asked several follow-up questions to measure their satisfaction with various attributes of the information. These attributes included accuracy, clarity, comprehensiveness, objectivity, how up to date the information was, how useful it was, and how easy it was to find the information the respondent was looking for. The charts below summarize satisfaction with these attributes. Accuracy had the highest ratings for both statistical tables and analytical articles, while ease of finding information had the lowest ratings.
From these measures we created three index scores that collapse the answers for individual items. The first index score is the average of the seven "very satisfied" ratings for statistical tables; the second index is a comparable average for analytical articles and reports; and the third index score is a simple average of the first two scores, indicating the average percent very satisfied with both types of products combined (it should be noted that adjusting these calculations for slight differences in the numbers of product and attribute ratings had no meaningful impact on the averages). As shown in the chart below, respondents were more likely to be very satisfied with the statistical tables than with the analytical articles.
We created another index score that averages the "very satisfied" ratings for each of the seven attributes across the two product types (analytical articles and statistical tables). The index scores are shown below. The highest index scores were for accuracy (59.3%) and the lowest for ease in locating the information (25.4%).
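The index-score construction described in the two paragraphs above can be sketched as follows. The seven per-attribute "very satisfied" percentages below are hypothetical placeholders, not the survey's actual attribute-level figures; only the averaging logic follows the report.

```python
# Index-score construction as described in the text. The per-attribute
# "percent very satisfied" values below are hypothetical placeholders;
# only the averaging logic mirrors the report.
attributes = ["accuracy", "clarity", "comprehensiveness", "objectivity",
              "up_to_date", "usefulness", "ease_of_finding"]

tables   = [60.0, 45.0, 44.0, 48.0, 42.0, 40.0, 25.0]  # hypothetical
articles = [58.0, 40.0, 38.0, 46.0, 36.0, 38.0, 26.0]  # hypothetical

# Index 1 and 2: simple averages of the seven ratings per product type.
index_tables = sum(tables) / len(tables)
index_articles = sum(articles) / len(articles)

# Index 3: average percent very satisfied across both product types.
index_combined = (index_tables + index_articles) / 2

# Attribute-level index: each attribute averaged across the two products.
per_attribute = {name: (t + a) / 2
                 for name, t, a in zip(attributes, tables, articles)}
```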
Satisfaction With Specific SSA Publications. Overall, a majority of respondents were satisfied with the publications they were asked to evaluate. The publications are listed below along with the number of people who rated each publication. Respondents gave the highest ratings to the Social Security Bulletin; 83.3% were very or somewhat satisfied. The lowest ratings were for Social Security Programs Throughout the World; 64.0% were very or somewhat satisfied. Although Fast Facts and Figures About Social Security ranked third in the number of respondents who were either very or somewhat satisfied, it had the highest percentage of very satisfied respondents (44.1%).
We also created an index score to collapse the ratings of the various publications in Question 18. It is a weighted average of the "very satisfied" ratings for the ten publications (based on the number of responses received for each publication). The average score was 36.6%, slightly lower than the average for statistical tables (43.3%) but on par with analytical articles (36.9%).
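The weighting described above can be sketched as follows: each publication's "very satisfied" percentage is weighted by the number of respondents who rated it. The publication names and figures below are illustrative placeholders, not the report's actual per-publication data.

```python
# Weighted-average index for publication ratings, as described in the
# text. Each publication's "very satisfied" percentage is weighted by
# its number of raters. All figures here are hypothetical.
ratings = {  # publication: (percent very satisfied, number of raters)
    "Publication A": (40.0, 500),
    "Publication B": (38.0, 450),
    "Publication C": (44.1, 400),
}

total_raters = sum(n for _, n in ratings.values())
weighted_avg = sum(pct * n for pct, n in ratings.values()) / total_raters
```

A simple (unweighted) mean would overstate the influence of lightly used publications; weighting by rater counts keeps the index proportional to actual usage.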
Satisfaction With Overall Quality. Eighty-six percent of respondents said they were either very or somewhat satisfied with the overall quality of SSA's research, statistical, or policy information in the previous 24 months. However, more respondents said they were somewhat satisfied (48.8%) than said they were very satisfied (37.2%).
Satisfaction With Overall Quality By Sample Group. There were some differences in overall satisfaction by sample group. Group differences were small for the combined ratings of very and somewhat satisfied, but somewhat larger when restricted to very satisfied. Compared to the other sample groups, more Subscribers said they were very satisfied (42.0%) and slightly more said they were either very or somewhat satisfied (88.2%). Non-Subscribers had the lowest rating on the combined measure (82.4%), while the OComm sample had the smallest share of respondents saying they were very satisfied (23.6%). Decisionmakers fell in the middle, ranking second on the combined measure and third on the very satisfied rating.
Satisfaction With SSA's Performance On New Issues. Respondents were also asked how satisfied they were with SSA's performance in identifying and working on new and emerging research and policy issues. Among those who expressed an opinion, respondents were not as satisfied as they were with the overall quality of SSA's current research. While almost two-thirds of respondents said they were very or somewhat satisfied with SSA's performance in this area, less than 20% said they were very satisfied.
Satisfaction With SSA's Performance On New Issues By Sample Group. There were modest differences in satisfaction levels among the four sample groups on this question as well. Subscribers reported the highest levels of satisfaction; 66.2% said they were very or somewhat satisfied. Respondents from the OComm list reported the lowest levels; 52.7% said they were very or somewhat satisfied.
Respondent Interests
Broad Interest Areas. A set of four questions asked respondents about their interest areas. The first questions asked about interest in the broad topic areas of Social Security and SSI, and retirement and disability. Respondents were somewhat more likely to be interested in Social Security issues than in SSI, and somewhat more interested in disability than in retirement issues. However, larger groups were interested in both Social Security and SSI (54.5%) and in both retirement and disability issues (39.0%).
Interest In Topics And Subgroups. Respondents were then asked more specifically about their interest in lists of topic areas and subgroups. More than half of respondents indicated that they were interested in current programs or proposals for change, economic well-being of the aged or disabled, work-related issues (e.g., disability and work, the retirement process), economic impact of Social Security or SSI programs, Social Security financing issues, and other government income security programs in the U.S. The top write-in response was Medicare/Medicaid and health-related issues.
The top five topic areas for each of the four sample groups were the same as those for the population as a whole, and in the same order of importance, with the exception of Decisionmakers. For Decisionmakers, current programs or proposals for change was the most commonly mentioned response to Question 6 (97.8%), work-related issues and the economic impact of SS or SSI programs were tied for second place (88.9%), and Social Security financing issues were more important to Decisionmakers (86.7%) than economic well-being of the aged or disabled (80.0%).
Question 7 asked respondents about their interests in a list of specific population subgroups. Over one-half of respondents indicated that they were interested in disabled adults, women, low wage workers, racial or ethnic minorities, and disabled children.
The top five subgroups of interest were the same across all four sample groups.
Respondents' Priority Research And Policy Issues. Respondents were then asked what they thought were the most important research or policy issues SSA should be working on in the next year or two. The question was open-ended, and 23% of respondents did not offer an opinion. Among those who did, the top response was Social Security reform/proposals, mentioned by 20.2% of respondents. More than 10% of respondents also mentioned financing/solvency/trust fund issues (14.7%), work incentive issues (14.4%), and issues concerning specific policies or provisions (12.8%).
The responses to Question 9 did not differ greatly across the four sample groups. The top five most important research and policy areas were the same in the different sample groups as in the total sample with two exceptions. For Non-Subscribers, aged and retirement issues were more frequently mentioned than SS reform/proposals. For Decisionmakers, issues related to disabled children and youth were the fourth most common response.
Interest In Public-Use Data Files. Respondents were also asked whether they would have any interest in access to public-use data files on various topics. The largest share of respondents (72.9%) were interested in SSA data that have been linked to surveys. Over half of all respondents also expressed an interest in data on the characteristics of Social Security beneficiaries (68.4%) and SSI recipients (62.4%).
Use Of Specific SSA Publications. Another way to measure the interest areas of respondents is to see which publications they said they used over the past 24 months in Question 18. Most of the publications listed were used by over half of the respondents.
Use of the publications did not differ much across the four sample groups. All groups listed the same publications as their top five.
Respondents' Recommendations for Improvement
Recommendations For Improvement. All respondents were asked an open-ended question about recommendations they had for improving Social Security's products and services. Forty-three percent did not respond, while an additional 16% said they had no recommendations. The most common recommendation was for additional data or analysis in particular topic areas (12%).
Among the small group of Decisionmakers, only 38% offered recommendations for improvements; four of the top five recommendations were similar to those from respondents as a whole, although the rank order was somewhat different. The recommendation made most frequently by Decisionmakers (13%) was to improve the timeliness of research, statistical, or policy products.
Reasons For Not Using SSA's Information. Another question that can be used to look for recommendations is Question 12. This question was asked of respondents who indicated active interest in the field but said they did not receive any research, statistical, or policy information from SSA publications, SSA web sites, or SSA staff in the preceding two years. The question asked these respondents if there was any particular reason they had not used SSA's information. Forty-one percent did not respond, and an additional 14% said there was no particular reason. The reason mentioned most frequently (18%) was that SSA's information was not related to the respondent's particular needs.
Respondent Types
Respondents were also asked several demographic or usage questions, such as where they worked, how long they had been interested in Social Security related issues, how frequently they received or sought SSA information, and how they received that information.
Work Affiliation. The first question asked respondents what general category of organization they worked in. Almost one-fourth (23.3%) of respondents worked in the executive branch of the federal government. Another 20.9% worked at a university or college.
Respondents with no work affiliation (for example, retirees or consultants) were most likely to say they were "very satisfied" with the overall quality of SSA's information in Question 19. Respondents who worked at nonprofit service organizations gave the lowest ratings for overall quality.
Respondents with no work affiliation (39.0%) were also most likely to be "very satisfied" with SSA's identification and work on new and emerging research and policy issues (Q8). Respondents in Congressional agencies (8.3%) were least likely to respond that they were "very satisfied."
Satisfaction By SSA Employment. SSA employees had only slightly higher satisfaction levels with overall quality than did non-SSA employees (80.4% vs. 78.8% very or somewhat satisfied).
SSA employees were also slightly more likely to be satisfied with SSA's performance in identifying and working on new and emerging research and policy issues than were non-SSA employees (59.2% vs. 54.0% very or somewhat satisfied).
Years of Professional Interest. Respondents generally had a long-standing interest in SSA related issues. Almost half (45.5%) had been interested for more than twenty years, and fewer than 10% for less than five years.
Overall satisfaction was somewhat higher among respondents who had been interested in SSA related issues for more than twenty years (40.6% very satisfied).
There was no clear relationship between the length of time respondents had been interested in SSA related issues and satisfaction with SSA's identification and work on new and emerging research and policy issues. Respondents interested in SSA related issues for five to nine years or for more than twenty years were the most likely to be very satisfied on this measure (around 19.5%); respondents interested for between ten and twenty years were the least likely to be very satisfied (14.8%).
Frequency of Use. Of the respondents who identified themselves as users of SSA's information, a slight majority (53.2%) received or sought the information fewer than ten times in the preceding twenty-four months. About one-quarter reported using it more than 20 times, including about 10% who used it more than 50 times.
Not surprisingly, respondents who had received or sought information from SSA more than 50 times in the past 24 months were the most likely to say they were very satisfied with the overall quality of SSA information.
Frequency of use also seems to be related to respondent satisfaction with SSA's performance in identifying and working on new and emerging research and policy issues. Respondents who had received or sought the information twenty times or more were more likely to say they were very satisfied than less frequent users.
Source of Information. Almost 90% of respondents said they got information from SSA through published or hard copy materials. Two-thirds got information from SSA's web sites.
Appendix A: Survey Methodology
Questionnaire Design And Pretests. The questionnaire was developed from an initial draft by SSA-OP. It was designed to focus on two critical components of customer satisfaction. The first part of the questionnaire was designed to measure whether individuals active in this field thought that SSA was producing research and policy information on topics of interest and importance to them. The second part of the questionnaire focused on users of SSA's research, statistical, or policy information within the two years preceding the survey, and their level of satisfaction with that information.
The questionnaire was pretested three different ways. First, Gallup conducted a focus group of nine users of OP's products to talk about the concepts in the questionnaire and to discover how they evaluated OP's products and services. Second, after the questionnaire was drafted, Gallup researchers interviewed four SSA employees to get their feedback on the questionnaire. Finally, Gallup interviewers conducted separate telephone interviews with nine OP customers, alternating between two slightly different versions of the questionnaire.
Sample Frame And Selection. Given the subject matter of this survey—research, statistical, and policy information from SSA—it was impossible to operationally define the population of customers or intended customers in a comprehensive manner. Clearly, the population would include subscribers to SSA-OP's publications. But the population would also include non-subscribers who are active in this area of research/policy and who make use of SSA materials in libraries and over the internet, who contact SSA with requests for information, or who receive information through SSA presentations and briefings. This population could not be fully identified. In addition, there was no obvious list for another segment of the target population, "Decisionmakers," as mentioned in SSA's Performance Plan.
After exploring the availability and utility of various lists and sources of names, OP's survey team, in consultation with Gallup and others, developed four lists that seemed a reasonable approach to a sampling frame for this first OP Customer Satisfaction Survey.
The first of these lists was composed of a small group of Decisionmakers, individuals in high-level positions in SSA and related federal agencies who are presumed to have enhanced potential for influencing policy decisions. This list of 59 names was prepared by OP's executive staff.
A second list consisted of individual Subscribers to a cross-section of SSA-OP publications. Specifically, those publications included: 1) Social Security Bulletin, 2) ORES Working Papers, 3) Social Security Programs Throughout the World, and 4) SSI reports, including SSI Disabled Workers and WIN Provisions, Children Receiving SSI, and SSI Annual Statistical Report. There were 885 non-duplicative names on this list.
The third list, Non-Subscribers who were known to have some active interest in this research/policy area, was more difficult to construct. Various possibilities were explored before the list was compiled, primarily from the following sources: members of the National Academy of Social Insurance, attendees at recent conferences of the Retirement Research Consortium (a consortium of researchers whose work is funded by SSA), and participants at recent conferences sponsored by several other research or policy organizations. For the latter, an effort was made to select organizations reflecting a broad spectrum of political philosophies. Many of these individuals were on more than one of these lists, and others were already included in the Decisionmakers or Subscribers lists. A non-duplicative count of this segment of the target population yielded 1,192 names.
Finally, SSA-OP obtained additional non-subscribers from a list recently developed by SSA's Office of Communications (OComm). Based on input from SSA's field offices throughout the country, this OComm "stakeholder" list identifies individuals at the community level who have expressed professional interest in SSA's policies and programs. (Many of these individuals are, for example, state or local government workers in social services or health programs.) Among specific areas of interest, a subset of these persons had identified themselves as interested in "statistical publications" from SSA. The non-duplicative count of these individuals was 802.
Certain individuals were excluded from these lists: employees in SSA-OP, persons living outside the U.S., and individuals who had participated in the focus group or pretest, described above. Members of Congress and congressional staffs were also excluded from the sample for the 2001 survey. In the early stages of planning this survey, it was learned that interviews and focus group sessions with key congressional staffers were being conducted by SSA-OComm during 2000-2001 in an effort to evaluate SSA's services to Congress, including the provision of statistical and policy analysis information. To avoid duplication of effort and undue respondent burden, SSA decided not to request Congressional participation in OP's 2001 customer satisfaction survey.
The sequence of the four sampling lists described above, and the exclusion of duplicate listings in each successive list, reflected decisions about the relative importance of each group and shaped the sample design.
First, since we intended to reach some conclusions about how well SSA is serving Decisionmakers (as suggested in the Performance Plan), a decision was made to include 100% of this small group in the survey sample. The second group (Subscribers) was also considered of primary importance to the survey's objectives. In order to develop a reliable baseline measure of customer satisfaction from the survey, it was essential to obtain a reasonably large number of responses from recent "users" of SSA's research and policy information. Of the four sampling lists, it seemed that the Subscriber group would likely include the highest proportion of these users. Thus, we sampled 100% of this group.
Among the third and fourth groups (Non-Subscribers who are actively interested in this research/policy area and the OComm list), we expected to find a number of users, but were also concerned about the interests and needs of intended customers who may not have made recent use of SSA's products and services. This important but secondary objective of the survey, along with the combined size of these two groups (roughly twice the size of the Subscriber group), suggested a lower sampling ratio for them.
There were two additional considerations in choosing the sampling ratios for these two groups and completing the sample design. First, while we did not plan to calculate statistical estimates (e.g., variances, confidence intervals) based on this imperfect specification of the population, we considered it important to achieve a completed sample of at least 900 total cases to enable meaningful subgroup analyses. Second, while Gallup planned to make every effort to achieve an 80% response rate, we also thought it was prudent to consider the possibility of a somewhat lower rate.
With these considerations in mind and with 944 respondents already included from the first two groups, a sampling ratio of .4293 was required for the third and fourth groups. The following table presents total numbers and numbers selected for the sample from each of the four sampling lists.
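The arithmetic behind the .4293 ratio follows directly from the counts given in this appendix, and can be checked as a sketch:

```python
# Derivation of the .4293 sampling ratio from the counts in this
# appendix: 944 cases from the first two groups were included at 100%,
# and the remainder of the 1,800-case sample was drawn proportionally
# from the 1,994 names on the third and fourth lists.
decisionmakers, subscribers = 59, 885      # both sampled at 100%
non_subscribers, ocomm = 1192, 802         # list counts for groups 3 and 4

target_sample = 1800
already_included = decisionmakers + subscribers        # 944
ratio = (target_sample - already_included) / (non_subscribers + ocomm)

print(round(ratio, 4))                                       # 0.4293
print(round(non_subscribers * ratio), round(ocomm * ratio))  # 512 344
```

Applying the ratio to each list reproduces the 512 Non-Subscriber and 344 OComm cases reported earlier.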
| Stratum/Sample Group | Number on List | Number Selected for Sample |
| --- | --- | --- |
| 4. OComm database | 802 | 344 |
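The stratum arithmetic can be sketched as follows. The helper functions and the illustrative inputs in the second example (a 70% fallback response rate and a hypothetical pool size) are assumptions for demonstration, not figures from the report; only the OComm numbers (802 on the list, ratio .4293) come from the text.

```python
def selected_count(pool_size: int, sampling_ratio: float) -> int:
    """Number of cases drawn from a stratum at a given sampling ratio."""
    return round(pool_size * sampling_ratio)

def required_ratio(target_completes: int, expected_response_rate: float,
                   certainty_cases: int, pool_size: int) -> float:
    """Sampling ratio needed for the remaining strata, given that
    `certainty_cases` are already sampled at 100% and assuming a
    uniform response rate across the whole sample."""
    needed_sample = target_completes / expected_response_rate
    return (needed_sample - certainty_cases) / pool_size

# The OComm stratum: 802 on the list, sampled at the report's ratio of .4293
print(selected_count(802, 0.4293))  # -> 344

# Illustration only: 900 target completes, 70% response, 944 certainty cases,
# and a hypothetical combined pool of 1,600 cases in the remaining strata
print(round(required_ratio(900, 0.7, 944, 1600), 4))
```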
The first step in data collection was to complete missing contact information for the selected sample. This was done in several stages. First, a Gallup contractor matched cases with a missing address or telephone number against a computerized database. Second, a Gallup staff member looked up any cases that were still missing contact information. After the initial mailing of the questionnaire, Gallup and SSA-OP staff searched for telephone numbers for any cases whose contact information remained incomplete.
The data collection schedule is listed in the table below. The first step was to mail the prenotification letter. This was followed by the survey packet, which included a cover letter, the survey questionnaire booklet, and a postage-paid return envelope. A reminder postcard was mailed a week after the survey packet. Telephone interviewers attempted to contact remaining non-respondents beginning two weeks after the reminder postcard was mailed and continuing for four more weeks.
| Data Collection Step | Date |
| --- | --- |
| Prenotification letter mailed | June 25, 2001 |
| Survey packet mailed | July 2, 2001 |
| Reminder postcard mailed | July 9, 2001 |
| Telephone interviewing | July 23 - August 24, 2001 |
Once the questionnaires were collected, the results from the three modes of data collection were combined into one data file. The file was examined for duplicates; each duplicate pair was resolved by keeping the most complete case and, where possible, filling its missing items with information from the duplicate case. The least complete cases were then eliminated from the file; this second step removed three cases that had 12 or more of the 20 questionnaire items missing.
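The duplicate-resolution rule described above can be sketched as follows; the record layout and question names are hypothetical, and `None` stands in for a missing item.

```python
def completeness(record: dict) -> int:
    """Count of answered items in a record."""
    return sum(1 for v in record.values() if v is not None)

def merge_duplicates(a: dict, b: dict) -> dict:
    """Keep the more complete case; fill its missing items from the other."""
    keeper, other = (a, b) if completeness(a) >= completeness(b) else (b, a)
    merged = dict(keeper)
    for item, value in other.items():
        if merged.get(item) is None and value is not None:
            merged[item] = value
    return merged

# Hypothetical duplicate returns from the same respondent, two modes
mail = {"q1": "yes", "q2": None, "q3": "often"}
web  = {"q1": "yes", "q2": "no",  "q3": None}
print(merge_duplicates(mail, web))
# -> {'q1': 'yes', 'q2': 'no', 'q3': 'often'}
```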
The table below shows the data collection results. There were 913 respondents who said they were active in the field of Social Security related research and policy, and another 130 who said they were inactive in the field. Over half of the active completes were collected with the mail questionnaire.
| Data Collection Result | Frequency | Percent |
| --- | --- | --- |
| Complete - Mail | 514 | 28.6 |
| Complete - Web | 104 | 5.8 |
| Complete - Telephone | 295 | 16.4 |
| Complete - Inactive | 130 | 7.2 |
The table below shows the final response rates for the total sample and by sample group. There were three cases that were returned with no questionnaire ID. Therefore it is unknown which sample group these cases came from.
[Table: final response rates for the total sample and by sample group]
The final status of the 757 incompletes is listed below. "Other" includes respondents who were ill, had a death in the family, or said they had already completed the survey by mail or on the web site. A "non-target" number is one at which the intended respondent could not be reached. There were 226 cases in which contact had been made but the interviewer needed to call back the respondent to complete the interview; each of these callback numbers had been dialed at least once, with an average of 8 calls and a maximum of 18. The table below shows the final outcome data for the 757 people who did not participate:
| Final Sample Status | Frequency | Percent |
| --- | --- | --- |
| Unlocatable - Mail | 57 | 7.50 |
| Unlocatable - Telephone | 41 | 5.39 |
| Non-working or disconnected number | 66 | 8.68 |
Data File Preparation and Analysis
After the initial steps in the data file preparation, the sample variables were added to the data file including: the sample source, whether or not the respondent is an SSA employee, which publication type the respondent subscribes to, and the ICS code variable (a code that identifies a small subset of cases that will be analyzed by SSA as part of a separate project on internal customer satisfaction).
The open-ended questions were coded after data collection. For questions where the verbatim response was an "other-specify" question, the response was examined to see if it fit into any of the existing response categories. If it did, it was added to the existing response category. Proper nouns, place names, expletives, and unintelligible words were removed from the verbatim response and replaced with a code such as (undecipherable). Up to three responses were coded for each open-ended response. The coding was completed in three stages: first by a Gallup coder, then a Gallup researcher reviewed the codes, and finally all verbatims and codes were sent to SSA-OP for review.
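The verbatim-scrubbing and code-capping rules might look like this in outline. The flagged-word set and category codes are hypothetical; in practice the flagging and coding were done by human coders and reviewers, not automatically.

```python
def scrub_verbatim(text: str, flagged_words: set) -> str:
    """Replace flagged tokens (proper nouns, place names, expletives,
    unintelligible words identified upstream) with a placeholder code."""
    return " ".join(
        "(undecipherable)" if word in flagged_words else word
        for word in text.split()
    )

def assign_codes(matched_categories: list) -> list:
    """Keep at most three codes per open-ended response."""
    return matched_categories[:3]

print(scrub_verbatim("call Smith about tables", {"Smith"}))
# -> call (undecipherable) about tables
print(assign_codes([12, 40, 7, 33]))  # -> [12, 40, 7]
```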
The next step in the data file preparation was to "clean" the data to make sure there were no inconsistent responses. For the data cleaning Gallup did the following:
- Checked the skip patterns in Questions 2, 11, 12, and 15. Our procedure is that if the answer to a stem question indicates the respondent should have skipped the follow-up questions, but a follow-up question was nevertheless completed, we change the answer to the stem question to be consistent with the follow-up responses. For example, if a respondent answered "no" to Question 11 (the stem question) but filled out any of Questions 13-19, we changed Question 11 to "yes."
- Gave a special value to questions that a respondent would legitimately skip. For example, if the respondent did not mark statistical tables in Question 15, all of Question 16 was coded with a special missing code. This coding of "inapplicables" gave us accurate counts of the number of responses actually missing for each question.
- For "mark all that apply" questions we created a separate variable for each response item that has a value of 2 if the respondent did not mark that item and 1 if the respondent did mark that item. If the respondent left the entire question blank, each item was left as missing.
- For "other specify" questions, if the respondent filled in the specify box but did not check the other box, we used the same one/two coding scheme mentioned above and assigned the other box a code of one.
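The first and third cleaning rules above can be sketched as follows; the column names and the stem/follow-up mapping are hypothetical, and `None` stands in for a missing item.

```python
def fix_stem(record: dict, stem: str, followups: list) -> dict:
    """Rule 1: if any follow-up was answered but the stem says 'no',
    change the stem to 'yes' so the skip pattern is consistent."""
    if record.get(stem) == "no" and any(record.get(q) is not None for q in followups):
        record[stem] = "yes"
    return record

def recode_mark_all(raw: list, items: list) -> dict:
    """Rule 3: mark-all-that-apply recoding. 1 = marked, 2 = not marked;
    every item is left missing if the whole question was left blank."""
    if not raw:
        return {item: None for item in items}
    return {item: (1 if item in raw else 2) for item in items}

rec = {"q11": "no", "q13": "monthly"}
print(fix_stem(rec, "q11", ["q13", "q14"]))  # q11 becomes "yes"
print(recode_mark_all(["a", "c"], ["a", "b", "c"]))
# -> {'a': 1, 'b': 2, 'c': 1}
```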
Item nonresponse was generally below 5%. Exceptions are listed in the table below. Questions 9 and 20 were open-ended questions, which typically have higher rates of nonresponse. No imputation was done for any missing values.
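Coding "inapplicables" separately matters when computing item nonresponse, since legitimately skipped items should not inflate the missing count. A minimal sketch, with a hypothetical value for the special missing code:

```python
INAPPLICABLE = -8  # hypothetical value for the 'legitimate skip' code

def item_nonresponse_rate(values: list) -> float:
    """Share of truly missing answers among respondents who should have
    answered the item; legitimate skips are excluded from the base."""
    eligible = [v for v in values if v != INAPPLICABLE]
    if not eligible:
        return 0.0
    missing = sum(1 for v in eligible if v is None)
    return missing / len(eligible)

# 10 respondents: 2 legitimately skipped, 8 eligible, 1 left the item blank
vals = [1, 2, None, 3, INAPPLICABLE, 2, 1, INAPPLICABLE, 3, 2]
print(round(item_nonresponse_rate(vals), 3))  # -> 0.125
```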
No weighting was done on the sample because the population of users is not fully defined. SSA-OP continues to try to identify their target audience. However, with the public nature of their publications, there may never be a finite population list of those using the SSA materials.
Appendix B: Survey Questionnaire
CUSTOMER SATISFACTION SURVEY
The Social Security Administration (SSA) has asked The Gallup Organization to find out what you think about its research, statistical, and policy analysis efforts.
SSA produces a variety of statistical and analytical information about its programs and about related issues. The information is disseminated in various ways, including the Internet, through SSA staff, and in publications and reports (e.g., the Social Security Bulletin, the Annual Statistical Supplement).
The purpose of this survey is to help SSA improve these products and services. As you will see, some of the survey questions are directed to anyone with professional interests in this area, while other questions are restricted to those who have actually used some of SSA's research, statistical, or policy information in recent years. Your opinions are still important to SSA and Gallup, even if you are a "non-user".
Please take a few minutes to fill out this questionnaire and send it back to us in the enclosed postage-paid envelope. We estimate that the survey will take about ten minutes to complete. Your answers will be kept strictly confidential, and will be merged with the responses of others like yourself.
If you wish to take this survey online, you may do so by logging in at http://ssaop.gallup.com. The code (PIN) you should use to access the survey is on the cover letter attached to this questionnaire.
Thank you again for your participation. We greatly appreciate your time and your help.
The PRIVACY ACT requires us to notify you that we are authorized to collect this information by Section 702 of the Social Security Act. You do NOT have to provide the information requested. However, the data you provide will help the Social Security Administration - Office of Policy to evaluate and improve its products and services. The Gallup Organization guarantees the confidentiality of every respondent.
The PAPERWORK REDUCTION ACT OF 1995 requires us to notify you that this information collection is in accordance with the clearance requirements of 44 U.S.C. § 3507, as amended by section 2 of the Paperwork Reduction Act of 1995. We may not conduct or sponsor, and you are not required to respond to, a collection of information unless it displays a valid OMB control number. We estimate that it will take you about 10 minutes to complete this questionnaire. This includes the time it will take you to read the instructions and fill out the questionnaire.
THANK YOU FOR PARTICIPATING IN THIS SURVEY.
PLEASE RETURN YOUR COMPLETED QUESTIONNAIRE IN THE ENCLOSED ENVELOPE.