SSA Special Reports

Tab B

TECHNICAL APPENDIX

I. Sampling

Overview

The Office of Quality Assurance and Performance Assessment (OQA) selects the 800 Number Customer Survey samples from automatic number identifier (ANI) detail data supplied by AT&T. Sample calls represent individuals who were connected to the 800 number service. These individuals may have spoken to a teleservice representative (TSR), been put in queue and hung up before speaking to a TSR, or used the automated network prompt/voice capture services. OQA selects samples twice each week to minimize the time between the sample call and the survey interview. For ease of survey administration, sample calls are separated into "east" and "west" groupings, depending on where in the United States the call originated.

OQA electronically processes the data records for these completed calls and randomly selects the calls for the initial sample. Computer listings of the sample calls are transmitted to Management Research and Planning, Inc. (MRP), the private sector company selected to administer the survey. Using the questionnaire and a randomly sequenced listing of sample telephone numbers provided by OQA, MRP employees call the originating telephone number and attempt to identify and reach the person who placed the sample call. When the original caller is reached, MRP attempts to secure his or her agreement to proceed with the survey interview. The responses from successfully contacted individuals who participate in the survey are stored in a data base for analysis and reporting.

AT&T ANI Detail Data Sampling Methodology

The ANI detail data, which serve as the basis for OQA's sample selection, contain the following information:

  • Date of the call;
  • Originating telephone number, including area code;
  • Length of the call in seconds;
  • Hour the call originated, in military time, eastern time zone;
  • Disposition of attempt to present call to SSA's network (i.e., completed transfer, busy, blocked at SSA network or other);
  • SSA teleservice center communications node to which initial transfer was made; and
  • Type of automated service selected.

OQA computer programs process the ANI data to exclude any nonsample-period calls and calls from originating telephone numbers from which more than four completed calls were made on any day during the sample week. (Experience has shown that the vast majority of these telephone numbers belong to businesses for which it is virtually impossible to identify a specific caller.) For the remaining calls, the number of completed calls from each originating telephone number is tabulated, and the local hour of the day is determined by using the time zone for the area code to adjust the recorded (eastern) time. Counts of eligible calls are accumulated nationally. An OQA-designed random number generator program, based on selected digits of the current date and time, generates random numbers which, in conjunction with the targeted sample size and the number of records remaining, are used to select the initial sample of calls.
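The exclusion and selection steps described above can be sketched in Python. The record fields, the time-zone table, and the select_sample function are illustrative assumptions for this sketch, not OQA's actual program.

```python
import random
from collections import Counter

# Illustrative ANI detail records; the field names are hypothetical.
calls = [
    {"phone": "202-555-0101", "date": "1999-02-01", "hour_et": 14, "area_code": "202"},
    {"phone": "202-555-0101", "date": "1999-02-01", "hour_et": 16, "area_code": "202"},
    {"phone": "415-555-0199", "date": "1999-02-02", "hour_et": 14, "area_code": "415"},
]

# Hours behind eastern time, by area code (a real table covers every area code).
TZ_OFFSET_FROM_ET = {"202": 0, "415": 3}

def select_sample(calls, target_size, max_daily=4, seed=None):
    # Exclude every call from any number that placed more than max_daily
    # completed calls on a single day (overwhelmingly business lines).
    daily = Counter((c["phone"], c["date"]) for c in calls)
    busy = {phone for (phone, _), n in daily.items() if n > max_daily}
    eligible = [c for c in calls if c["phone"] not in busy]
    # Convert the recorded eastern hour to the caller's local hour.
    for c in eligible:
        c["hour_local"] = (c["hour_et"] - TZ_OFFSET_FROM_ET[c["area_code"]]) % 24
    # Draw a simple random sample of the targeted size.
    rng = random.Random(seed)
    return rng.sample(eligible, min(target_size, len(eligible)))

sample = select_sample(calls, target_size=2, seed=42)
```

The actual program accumulates counts nationally and derives its random numbers from digits of the current date and time; the standard library generator here is a stand-in for that step.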

Final Sample and Universe Determination

The sample size is computed based on the number of responses required to ensure that a satisfaction rating of 90 percent would have a sampling variability of ±2 percent at the 95-percent confidence level.
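The appendix does not show the computation, but the familiar sample-size formula for a proportion, n = z²p(1-p)/e², reproduces the stated targets (p = 0.90, margin e = 0.02, z = 1.96 for 95-percent confidence):

```python
import math

def required_sample_size(p, margin, z=1.96):
    """Responses needed to estimate a proportion p within +/- margin
    at the confidence level implied by z (1.96 for 95 percent)."""
    return math.ceil(z**2 * p * (1 - p) / margin**2)

n = required_sample_size(p=0.90, margin=0.02)  # 865 completed responses
```

Whether OQA applies a finite-population or design-effect adjustment on top of this is not stated in the report.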

II. Survey Methodology

Blaise, a commercial computer-assisted telephone interviewing software product, is used for survey administration. The survey questions are displayed on each interviewer's personal computer screen, and as callers' responses to questions are entered, they are simultaneously stored in the Blaise data base. Since August 1997, MRP has administered the survey by telephone using the Blaise questionnaire developed by OQA. Using the sample listings, MRP employees call the originating telephone numbers and attempt to reach the person who placed the sampled call. To maximize the response rate, OQA requires MRP to call every telephone number on the sample listing and make at least 15 attempts over a 10-day period, alternating between morning, afternoon and evening hours.

Each sample call is either completed or excluded. The survey instrument provides codes for calls that are excluded from the study. MRP transmits the Blaise data base to OQA as each of the sample listings is completed, creating a national data base. The responses from successfully contacted individuals who participate in the survey comprise the data base used for analysis and reporting. Details on completion/exclusion rates for the current survey period are included under Part IV. Completion and Exclusion Rates.

Beginning in February 1999, the survey has used a redesigned questionnaire that reflects the purpose of interaction tracking under SSA's new Market Measurement Program. Interaction tracking is intended to assist SSA in monitoring and reporting on customer service performance. The questionnaire focuses on customers' ratings of service using a six-point, "world-class" rating scale.

III. Analysis and Reporting

OQA staff weight the data received from MRP by sample listing to reflect the number of calls that reached the 800 number during that sample period. The weight for completed survey calls is calculated by dividing the number of calls reaching the 800 number in each sample period by the number of completed survey calls for that period. These weights are applied to each survey call prior to tabulation.
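The weighting rule amounts to a one-line ratio per sample period. The period names and counts below are hypothetical; the real figures come from the ANI data and the completed Blaise interviews.

```python
# Hypothetical per-period counts, for illustration only.
periods = {
    "east-week1": {"calls_to_800": 1_200_000, "completed": 610},
    "west-week1": {"calls_to_800": 900_000, "completed": 610},
}

# weight = calls reaching the 800 number / completed survey calls,
# so each completed interview stands for that many actual callers.
weights = {name: p["calls_to_800"] / p["completed"] for name, p in periods.items()}
```

Multiplying each interview record by its period's weight before tabulation makes the national totals reflect actual call volume rather than interview counts.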

The weighted survey data are compiled and analyzed, and the findings are presented in a report of national findings. The narrative answers to open-ended questions are analyzed by OQA staff in central office.

Statistically Significant Differences

Tests for statistical significance are used in analyzing the results of statistical surveys to determine whether differences between responses are "real" or could be due to chance. The calculation takes into account two main variables: the percentage-point difference between two findings and the corresponding numbers of responses. (Not all questions apply to all respondents to a particular survey, so the number of responses can vary depending on the survey question.) It is possible for a small difference in one comparison to be statistically significant, while a larger percentage-point difference in another comparison may not be statistically significant because the number of responses is relatively small.

This is illustrated by the following example:

In an earlier survey of 800 number callers, respondents were asked whether the representative was able to handle the matter completely. Of the 1,074 respondents to this question, 80 percent answered "yes." In the previous survey, 76 percent of the 1,152 respondents to that question had given the same answer. The 4 percentage-point difference was statistically significant. However, of the 21 callers who answered a question about why it was hard to understand the recorded message, 40 percent said the recording gave too many options or went too fast. Of the 9 customers who answered this question in the prior survey, 76 percent had answered it the same way. Even though there was a 36 percentage-point difference, it was not statistically significant.

The influencing variable was the number of respondents in each calculation. The number of callers who answered the question about complete call handling by the representative was considerably greater than the number of customers answering the question about why it was hard to understand the recorded message.
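The report does not name the specific test it uses, but a pooled two-proportion z-test is one standard choice for this kind of comparison, and it reproduces both outcomes in the example above (critical value 1.96 at the 95-percent level):

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Pooled two-proportion z statistic for the difference p1 - p2."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 80 percent of 1,074 vs. 76 percent of 1,152: |z| exceeds 1.96, so the
# 4-point difference is significant.
z_large = two_proportion_z(0.80, 1074, 0.76, 1152)

# 40 percent of 21 vs. 76 percent of 9: |z| falls short of 1.96, so the
# 36-point difference is not significant.
z_small = two_proportion_z(0.40, 21, 0.76, 9)
```

With only 21 and 9 respondents, the standard error in the second comparison is large enough to swamp even a 36-point gap, which is exactly the point of the example.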

IV. Completion and Exclusion Rates

For the February 1999 survey, MRP employees successfully completed interviews with 1,220 out of 2,597 sample callers. The remaining sample calls were not completed for the following reasons:

Nonresponders

  • Three hundred sixty-seven callers did not wish to participate or deliberately exited the interview prematurely.

  • For 262 telephone numbers, the person contacted at the sample telephone number said he or she had not called the 800 number and did not know who had.

  • Thirteen calls originated from an organization (e.g., a hospital) that requested information about more than one person.

  • Nineteen calls were excluded for some other reason.

Out of Scope

  • Six hundred nineteen calls originated from a public or business telephone (since, in most cases, callers who placed a specific call from such numbers cannot be identified).

  • Ninety-seven callers could not be reached because the telephone at the originating telephone number was disconnected at the time of the survey.

We calculated the response rate by eliminating from the sample the 619 calls from business or public telephones plus the 97 callers who could not be reached because the telephone number was disconnected. The effective sample size becomes 1,881 (2,597 - 716), and the resulting response rate is 65 percent (1,220/1,881).
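As a quick check, the response-rate arithmetic from the figures above:

```python
total_sample = 2_597
out_of_scope = 619 + 97        # business/public phones + disconnected numbers
completed = 1_220

effective_sample = total_sample - out_of_scope   # 1,881
response_rate = completed / effective_sample     # about 0.649, reported as 65 percent
```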