Evaluation

AMSTAT’s Expertise

AMSTAT provides fast and reliable credential evaluation services for numerous local, state, and federal government entities, schools, community and non-profit organizations, and business clients. Hundreds of thousands of successful clients cite these reasons for choosing to work with AMSTAT:

  • All of our principals hold Ph.D.s from leading universities, including Harvard, Stanford, and Columbia.
  • They include nationally renowned evaluation experts.
  • They have over 100 years of practical experience in evaluation and performance measurement.
  • Our fees usually pale in comparison to the savings and/or additional profits that our work produces for our clients.
  • Over 90% of our clients request our assistance more than once; they are almost universally happy with our different brand of evaluation.
  • We are more reasonably priced than most other companies offering health evaluations.
  • We offer personalized, comprehensive, and friendly support during and after your consultation with us.
  • We offer ultra-fast turnaround times.

Our Services

AMSTAT can conduct several types of evaluations, including the following:

Formative Evaluation


  • Ensure that a program or program activity is feasible, appropriate, and acceptable before it is fully implemented.
  • Conduct it when a new program or activity is being developed or when an existing one is being adapted or modified.
  • Examine whether the proposed program elements are likely to be needed, understood, and accepted by the population you want to reach.
  • Examine the extent to which an evaluation is possible, based on the goals and objectives.
  • Allow for modifications to be made to the plan before full implementation begins.
  • Maximize the likelihood that the program will succeed.

Process Evaluation


  • Determine whether program activities have been implemented as intended.
  • Evaluate how well the program is working, the extent to which the program is being implemented as designed, and whether the program is accessible and acceptable to its target population.
  • Provide an early warning for any problems that may occur.
  • Allow programs to monitor how well their program plans and activities are working.
  • Track program information related to Who, What, When and Where questions:
    • To whom did you direct program efforts?
    • What has your program done?
    • When did your program activities take place?
    • Where did your program activities take place?
    • What are the barriers/facilitators to implementation of program activities?

Outcome Evaluation


  • Measure the degree to which the program is having an effect on the target population’s behaviors.
  • Tell whether the program is effective in meeting its objectives.
  • Some questions we may address with an outcome evaluation include:
    • Were medical providers who received intensive STD training more likely to effectively counsel, screen and treat patients than those who did not?
    • Did the implementation of STD counseling in community-based organizations result in changes in knowledge, attitudes, and skills among the members of the target population?
    • Did the program have any unintended (beneficial or adverse) effects on the target population(s)?
    • Do the benefits of the STD activity justify a continued allocation of resources?

Economic Evaluation


  • Examine what resources are being used in a program and their costs (direct and indirect) compared to outcomes.
  • Provide program managers and funders a way to assess cost relative to effects: in short, how much bang you get for your buck.
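One common way to put a number on that idea is the incremental cost-effectiveness ratio (ICER). The sketch below is a minimal illustration with invented figures, not a formula drawn from any specific engagement:

```python
# Incremental cost-effectiveness ratio: the extra cost of a program
# divided by the extra effect it produces, relative to the status quo.
# All figures below are invented for illustration.

def icer(cost_new, cost_old, effect_new, effect_old):
    """Extra dollars spent per extra unit of effect gained."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# A $120,000 program averting 50 infections vs. an $80,000 status quo averting 30:
print(icer(120_000, 80_000, 50, 30))  # 2000.0 dollars per additional infection averted
```

When the new program is both cheaper and more effective, the ratio is negative and the decision is usually easy; the interesting cases are the trade-offs.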

Impact Evaluation


  • Examine the degree to which the program meets its ultimate goal.
  • Provide evidence for use in policy and funding decisions.

We are happy to provide the help you need at any or all of the following steps in obtaining the answers you seek:

  • developing a detailed program evaluation management plan;
  • reviewing program documents and records;
  • conducting pre-program development (Needs Assessment) evaluation;
  • conducting implementation (Process or Formative) evaluation;
  • conducting impact (Performance, Outcome or Summative) evaluation;
  • conducting site and classroom observations;
  • conducting focus group interviews;
  • conducting surveys;
  • leading site evaluation team meetings;
  • inputting, organizing, and cleaning the data;
  • implementing data analysis;
  • developing written evaluation reports;
  • presenting evaluation results to the governing board.

AMSTAT is dedicated to offering the following professional services:

  • Independent program evaluation services for grants
  • Performance measurement for corporations
  • Government contracting activities
  • Statistical and other types of quantitative analysis and support
  • Other types of administrative functions

Program evaluation is a tool with which to demonstrate accountability to an array of stakeholders who may include funding sources, policymakers, state, and local agencies implementing the program, and community leaders. AMSTAT can:

  • Engage stakeholders
  • Describe the program
  • Focus the evaluation
  • Gather credible evidence
  • Justify conclusions
  • Ensure the use of evaluation findings and share lessons learned


Devising the Sampling Plan

We are experts in devising sampling plans. We can choose among options such as:

Probability Sampling

  • Simple Random Sampling
  • Stratified Random Sampling
  • Cluster Sampling

Nonprobability Sampling

  • Quota Sampling
  • Purposive Sampling
  • Snowball Sampling
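As a rough sketch of how the probability designs above differ, here is a short Python illustration; the population, strata, and cluster definitions are invented for demonstration:

```python
import random

random.seed(0)  # reproducible illustration
population = [{"id": i, "stratum": "urban" if i % 3 else "rural"} for i in range(90)]

# Simple random sampling: every unit has an equal chance of selection.
srs = random.sample(population, k=9)

# Stratified random sampling: sample within each stratum separately.
strata = {}
for unit in population:
    strata.setdefault(unit["stratum"], []).append(unit)
stratified = [u for group in strata.values() for u in random.sample(group, k=3)]

# Cluster sampling: randomly pick whole clusters, then take every unit in them.
clusters = {c: [u for u in population if u["id"] % 9 == c] for c in range(9)}
chosen = random.sample(sorted(clusters), k=2)
cluster_sample = [u for c in chosen for u in clusters[c]]

print(len(srs), len(stratified), len(cluster_sample))  # 9 6 20
```

Stratified sampling guarantees representation of each stratum; cluster sampling trades some precision for cheaper field work, since only the chosen clusters need to be visited.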

Specifying the Survey

We are experts in specifying the initial survey based on your research interests. We can choose among scale types such as:

  • Nominal Scales
  • Ordinal Scales
  • Interval Scales
  • Ratio Scales
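The scale type determines which summary statistics are meaningful. A small Python sketch, with invented data:

```python
from statistics import mode, median, mean

# Which summaries are admissible depends on the scale of measurement.
nominal = ["red", "blue", "red", "green"]   # categories only -> mode
ordinal = [1, 2, 2, 3, 5]                   # ranked values   -> mode, median
interval = [20.0, 22.5, 21.0, 19.5]         # equal intervals, no true zero -> also mean
ratio = [0.0, 3.2, 6.4, 12.8]               # true zero -> ratios are meaningful

print(mode(nominal))        # 'red'
print(median(ordinal))      # 2
print(mean(interval))       # 20.75
print(ratio[3] / ratio[1])  # 4.0 -> "four times as much" is a valid statement
```

A mean of nominal codes is meaningless, and statements like "twice as much" require a true zero, i.e., a ratio scale.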

Developing the Interview

Before an interview takes place, respondents should be informed about the study details and given assurance about ethical principles, such as anonymity and confidentiality. This gives respondents some idea of what to expect from the interview, increases the likelihood of honesty and is also a fundamental aspect of the informed consent process. Wherever possible, interviews should be conducted in areas free from distractions and at times and locations that are most suitable for participants. For many this may be at their own home in the evenings. Whilst researchers may have less control over the home environment, familiarity may help the respondent to relax and result in a more productive interview. Establishing rapport with participants prior to the interview is also important as this can also have a positive effect on the subsequent development of the interview.


We can design and perform the required statistical analyses.  Here is a sample of some of the analytical tools with which we are familiar:

  • Correlation Analysis, T-test, Chi-square Test, Regression Analysis, Logistic Regression, Hierarchical Regression Analysis, Factor Analysis, Principal Components Analysis, One-way ANOVA/ANCOVA, One-way MANOVA/MANCOVA, Factorial ANOVA/ANCOVA, Repeated Measures ANOVA/ANCOVA, Repeated Measures MANOVA/MANCOVA, Nonparametric Test (Wilcoxon signed rank test, Friedman test, Kruskal-Wallis test, Mann-Whitney test, Spearman Rank Correlation), Structural Equation Modeling, Multilevel Structural Equation Modeling, Confirmatory Factor Analysis, Multilevel Confirmatory Factor Analysis, Exploratory Factor Analysis, Mediation Analysis, Moderation Analysis, Time Series Analysis, Spatial Time-series Modeling, Cluster Analysis, Cox Regression, Kaplan-Meier Survival Analysis, Trend Analysis, Sensitivity Analysis, Hierarchical Linear Modeling (HLM), Bayesian Analysis, Bayesian Cox Regression, Joint Hierarchical Bayesian Modeling, Latent Class Analysis, Longitudinal Growth Modeling, Mixed-Effects Regression Model, Meta-Analysis, Mixture Model, Linear Mixed Models, Predictive Modeling, Distribution Analysis (e.g., Lognormal, Weibull, Gamma), Decision Tree, Ensemble Analysis, De-identification, Interim Analysis, Item Analysis, Discriminant Analysis, Binomial Test, Heterogeneity Test, Multidimensional Scaling, Tau-U Analysis, and Advanced Statistical Analysis
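As a minimal illustration of two of the simpler tools in that list, here is a pure-Python sketch of Welch's two-sample t statistic and the Pearson correlation coefficient; the data are invented for demonstration:

```python
from math import sqrt
from statistics import mean, stdev

def welch_t(x, y):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    vx, vy = stdev(x) ** 2, stdev(y) ** 2
    return (mean(x) - mean(y)) / sqrt(vx / len(x) + vy / len(y))

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

treated = [5.1, 6.0, 5.8, 6.3, 5.5]   # invented outcome scores
control = [4.2, 4.9, 5.0, 4.4, 4.7]
print(round(welch_t(treated, control), 2))
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # 1.0 (perfectly linear)
```

In practice one would obtain p-values and degrees of freedom from a statistics package; the point here is only the shape of the computation.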

We have expertise in virtually every statistical and qualitative software package, including but not limited to:

  • SAS, SPSS, Stata, HLM, Mplus, R, SPSS Amos, SPSS Modeler, Azure, Access, WinBUGS, Minitab, Match!3, Zview

We are dedicated to helping our clients analyze their qualitative data utilizing a number of different methodologies.


Meta-Analysis

Meta-analysis is a methodology employed to synthesize the outcomes of multiple studies on the same topic or outcome measure. We can also integrate research findings through qualitative means. Qualitative meta-analysis follows the same replicable procedures as a quantitative meta-analysis; however, it is interpretive rather than aggregative. In either case, it is critical to define the domain of research and establish criteria for including studies in the review.
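On the quantitative side, the core aggregation step is inverse-variance weighting. Here is a minimal fixed-effect sketch with invented study effects and variances:

```python
from math import sqrt

# Hypothetical study results: (effect size, variance of the effect size).
studies = [(0.30, 0.04), (0.45, 0.09), (0.25, 0.02)]

# Fixed-effect inverse-variance pooling: weight each study by 1 / variance,
# so more precise studies contribute more to the pooled estimate.
weights = [1 / v for _, v in studies]
pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
se = sqrt(1 / sum(weights))
print(round(pooled, 3), round(se, 3))
```

A random-effects model would add a between-study variance component (e.g., the DerSimonian–Laird estimate) to each study's variance before weighting.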


Modified van Kaam Method

Standardized as a phenomenological research methodology by Moustakas (1994), the modified van Kaam (1959) method involves understanding the essence, meaning, and structure of an individual’s lived experiences. AMSTAT Consulting uses this methodology to look for patterns and trends by identifying shared beliefs.

Analytic Induction

We are experts in observing events, stating a hypothesis, observing additional events, and evaluating whether the new observations follow the hypothesis. We can ease the process for anyone who uses analytic induction.

Logical or Matrix Analysis

Logical or matrix analysis emphasizes the processes of reasoning while analytic induction relies on observations. Thus, how the answer is found is as important as what the answer actually is.

Performance measurement and improvement are systematic processes by which an organization continuously and consistently tracks and applies important program and operations data for the purpose of optimizing its ability to efficiently and effectively advance its desired social impact. The most powerful performance measurement systems are typically a core responsibility of an organization’s own staff, who integrate program, financial and organizational data to measure an organization’s progress and success.

AMSTAT can conduct performance measurement to enable an organization to continuously learn and improve, which helps it to achieve better results. The metrics tracked should be derived from an organization’s intended impact and theory of change—what the organization is holding itself accountable for achieving and how to get there. By measuring performance, we can:

  • Track progress toward, and be held accountable for, its intended impact and theory of change
  • Ensure programs or initiatives are implemented as designed
  • Learn about ways to achieve even better results by analyzing insights
  • Communicate progress and successes internally and externally to staff, beneficiaries, funders, peer organizations, and the broader community
  • Over time, gain insights about program effectiveness and what works and, if appropriate, prepare for rigorous program evaluations

There are five key steps to a strong performance measurement and improvement process. We can:

  • Define: Clarify your definition of success and the critical questions or decisions that performance measurement will help inform.
    • Crystallize your intended impact and theory of change.
    • Derive clear, prioritized metrics from these that cover program, financial, and organizational data.
    • Design and develop a data system and data-collection process to systematically gather and analyze this data at appropriate intervals.
  • Measure: Collect information, verify and validate it, and track it in the data system.
  • Learn: Analyze data and generate reports to identify insights and propose ways to improve.
  • Improve: Decide which improvements will strengthen the organization’s success and begin implementing them.
  • Share: Decide how what you have learned may help others in your organization and in the greater field, and share your metrics and results.

Good leadership and management are essential to organizational development, performance, and sustainability. An organization succeeds because of what it does (a shared commitment to accomplish something useful and important) and how it does it (the way it functions, decides, evaluates, adapts, and delegates).

Several factors determine how an organization does its work and accomplishes its objectives:

  • Effectiveness and functioning of individuals at all levels of the organization
  • Management systems supporting their work
  • Organizational culture
  • Adequacy of human and financial resources

Organizational performance always includes some element of customer satisfaction. One evaluates what an organization does in relation to the goals and objectives it has established. Evaluators should, therefore, define measures or indicators in relation to the specific long- and short-range objectives set by the organization, many of which are presented in this database. We can:

  • Use a standard set of criteria based on national or international norms and standards
  • Develop indicators for each management area and component in collaboration with the organization as part of an exercise to review and strengthen its management systems
    • This approach involves an assessment to determine the baseline stage of development of the organization.


Evaluation encourages us to examine the operations of a program, including which activities take place, who conducts the activities, and who is reached as a result. In addition, evaluation will show how faithfully the program adheres to implementation protocols. Through program evaluation, we can determine whether activities are implemented as planned and identify program strengths, weaknesses, and areas for improvement.

AMSTAT provides a variety of specialized evaluation services in health. They include:

  • Tobacco prevention, evidence-based medicine, clinical practice guidelines, children and family services, healthcare ethics, biostatistical studies, clinical trial design and methodology, medical education, biostatistical consulting, clinical program evaluation, performance measurement, and process improvement

Our leaders

Dr. David Fetterman
Advisory Board (Fetterman & Associates, President)

Ph.D., Stanford University
Master’s Degree, Stanford University


Stanford University, Professor
School of Medicine, Stanford University, Director of Evaluation

HONORS (selected)

American Educational Research Association Research on Evaluation Distinguished Scholar Award, 2013
Paul F. Lazarsfeld Award for Contributions to Evaluation Theory, American Evaluation Association, 2000
Mensa Education and Research Foundation Award for Excellence, 1990
Myrdal Award for Cumulative Contributions to Evaluation Practice, American Evaluation Association, 1995
Outstanding Higher Education Professional, Neag School of Education, University of Connecticut, 2008
Who’s Who in America, 1990, 1995-1996, 1999, 2008-2012

PROJECTS (selected)

$15 Million Digital Divide Project, Hewlett-Packard Philanthropy and Education
W. K. Kellogg Foundation
Case Method, Columbia School of Journalism
Digital Media Center, Knight Foundation
Arkansas Department of Education
One East Palo Alto City Revitalization Project, Hewlett Foundation

BOOKS (selected – over 100)

Fetterman, D.M., Rodriguez-Campos, L., and Zukoski, A. (2017). Collaborative, Participatory, and Empowerment Evaluation: Stakeholder Involvement Approaches. New York: Guilford Publications.
Fetterman, D.M., Kaftarian, S., and Wandersman, A. (2015). Empowerment Evaluation: Knowledge and Tools for Self-assessment, Evaluation Capacity Building, and Accountability. Thousand Oaks, CA: Sage.
Fetterman, D.M. (2013). Empowerment Evaluation in the Digital Villages: Hewlett-Packard’s $15 Million Race Toward Social Justice. Stanford: Stanford University Press.
Fetterman, D.M. (2010). Ethnography: Step by Step (Third Edition). Thousand Oaks, CA: Sage.

CHAPTERS AND ARTICLES (selected – over 100)

Fetterman, D.M. (in press). Empowerment Evaluation. The SAGE Encyclopedia of Educational Research, Measurement, and Evaluation. Thousand Oaks, CA: Sage.
Mansh, M., White, W., Gee-Tong, L., Lunn, M., Obedin-Maliver, J., Stewart, L., Goldsmith, E., Brenman, S., Tran, E., Wells, M., Fetterman, D.M., Garcia, G. (2015). Sexual and Gender Minority Identity Disclosure During Undergraduate Medical Education: “In the Closet” in Medical School. Academic Medicine, 90(5),634-644.
Wang JY, Lin H, Lewis PY, Fetterman DM, Gesundheit N. (2015). Is a career in medicine the right choice? The impact of a physician shadowing program on undergraduate premedical students. Acad Med. May, 90(5),629-33.
White, W., Brenman, S., Paradis, E., Goldsmith, E.S., Lunn, M.R., Obedin-Maliver, J., Stewart, L., Tran, E., Wells, M., Chamberlain, L.J., Fetterman, D.M., and Garcia, G. (2015). Lesbian, Gay, Bisexual, and Transgender Patient Care: Medical Students’ Preparedness and Comfort. Teaching and Learning in Medicine: An International Journal. 27(30), 254-263


RADIO INTERVIEWS (selected)

Empowerment Evaluation in the Digital Villages (book), KAZI FM, Houston, Texas, March 29, 2013.
Chronicle of Philanthropy article about evaluation and nonprofit survival (Chronicle), WPFM FM, Washington, D.C. March 25, 2013.
Empowerment Evaluation in the Digital Villages (book), Kathryn Zox Show, March 13, 2013.
Empowerment Evaluation in the Digital Villages (book), Money Matters Network, Host Stu Taylor, January 28, 2013.
Empowerment Evaluation in the Digital Villages (book), WKXL-AM, Concord, New Hampshire, Host Bill Kearney, January 17, 2013.
Empowerment Evaluation in the Digital Villages (book), WPHM-AM Detroit, Host Paul Miller, January 14, 2013.
Empowerment Evaluation in the Digital Villages (book), Business Matters Radio, Host Thomas White, January 14, 2013.

ENCYCLOPEDIA (selected): The International Encyclopedia of Education and Encyclopedia of Human Intelligence

Dr. Ann E. K. Um
President and CEO

Doctorate Degree, Columbia University
Master’s Degree, Stanford University
Master’s Degree, Columbia University


Harvard Medical School, DFCI, Research Data Manager
Harvard Medical School, Brigham and Women’s Hospital, Data Manager
The University of Texas, Assistant Professor
Harvard University, Harvard Innovation Labs, Experfy, Instructor
Johns Hopkins Hospital, Advisor
United States Environmental Protection Agency, Human Studies Review Board


PUBLICATIONS (selected)

Autonomy Support, Self-Concept, and Mathematics Performance: A Structural Equation Analysis. Saarbrucken, Germany: VDM Verlag, 2010.
Motivation and Mathematics Achievement: A Structural Equation Analysis, Saarbrucken. Saarbrucken, Germany: VDM Verlag, 2008.
Motivation and Mathematics Performance: A Structural Equation Analysis. Michigan, Ann Arbor: ProQuest, 2006.
Motivation and Mathematics Performance: A Structural Equation Analysis (doctoral dissertation). Columbia University, New York, 2005.


PRESENTATIONS (selected)

Motivation and Mathematics Performance: A Structural Equation Analysis, National Council on Measurement in Education, Montreal, Quebec, Canada, 2005.
Comparing Eighth Grade Diagnostic Test Results for Korean and American Students, National Council on Measurement in Education, Chicago, Illinois, 2003.

Learn more about our Ph.D.-level experts

This list shows a sample of our past clients.

Explore more quantitative studies
Explore more qualitative studies

Read the latest posts

Leadership Behavior

March 23rd, 2020

The purpose of this qualitative cross-sectional case study was to examine leadership behavior and successful integration of technology in high schools. The research question was: RQ1: What leadership behavior supports successful integration of technology [...]

Relative Risk of Lung Cancer 

March 23rd, 2020

The purpose of this quantitative study was to compare the difference in relative risk (RR) of lung cancer between the intervention and control groups. The hypothesis was: H1: The intervention group will have an increase in relative [...]


March 20th, 2020

The purpose of the qualitative phenomenological study was to examine how patients diagnosed with cancer can cope with stress. The research question was: RQ1: How can patients diagnosed with cancer cope with stress? We [...]

Explore the blog posts: Quantitative Research
Explore the blog posts: Qualitative Research

“We have been very pleased working with AMSTAT. The service was custom-tailored and completed on time. The statistical report was detailed, with excellent graphics. The cost of the services was affordable for a start-up company such as EndoLogic! Dr. Ann is very detail-oriented and likes to know thoroughly the project being analyzed.”

Dr. Zamir S. Brelvi, MD, PhD., CEO, EndoLogic

“Dr. Ann has been instrumental in helping with our statistical needs. In addition to her professionalism, she has been prompt and thorough with all of our requests. Dr. Ann’s work is impeccable, and I would recommend her services to anyone in need of assistance with statistical methods or interpretation. We plan on using Dr. Ann for all of our future needs, and I am thrilled to have been introduced to her.”

Dr. Raj Singhal, MD., Research Director of Pain Management, Phoenix Children's Hospital

“I have worked closely with AMSTAT on the data analysis/results of two research projects so feel as though I am knowledgeable about their expertise. On all accounts, the company provided me with reliable statistical analysis and results that I could translate into publishable format. They are conscientious experts who provide keen insights into appropriate statistical analysis given various data sets. I highly recommend them for your statistical support needs!”

Dr. Vincent Salyers, Dean and Professor, Gonzaga University

“I am a physician and was in need of statistical analysis of research data. I found AMSTAT on online search. Dr. Ann called me and explained the process involved in data analysis. Dr. Ann was always very prompt, helpful, intelligent and took time explaining the various tests used in conducting data analysis. Thank you so much!! I look forward to working with you in the future.”

Dr. Haritha Boppana, MD, DHA, Prisma Health Greenville Memorial Hospital

“Extremely professional. Attends to your project needs with skills and expertise. Pays attention to all details. Offers suggestions and recommendation for better and more effective use of your data. Creative and sincere. Thank you very much.”

Prof. Mohamed Toufic El Hussein, Mount Royal University

“My project required the analysis of a complex survey that required a great deal of help in organizing the data and analyses. In addition, the project required a quick turn-around. AMSTAT asked all the right questions, made realistic and helpful suggestions, and completed the project in a timely manner. They were professional and helpful throughout the process. I highly recommend them.”

Dr. Nancy Allen, PhD., Curriculum and Technology Consultant
Check out our 5-star reviews

Get $200 Credit by Calling Us Now At (301) 800-0038

Join The 300,000+ Satisfied Customers Today

If You Want To Set Up A Specific Time To Discuss Your Project, Schedule A Free Consultation! 

Contact us now!
Schedule a Free Consultation