AMSTAT Consulting’s Four Areas of Expertise
AMSTAT Consulting is dedicated to providing data management and statistical analysis, and offers expert consultation services to government agencies.
Thousands of successful clients cite these reasons for choosing to work with AMSTAT Consulting:
- All of our principals hold PhDs in statistics from leading universities, including Harvard, Stanford, and Columbia.
- They include nationally renowned scholars who are also talented professors.
- They have extensive backgrounds in statistics and over 100 years of combined practical experience in quantitative and qualitative methods.
- They are experts in statistical analyses such as the t-test, ANOVA, traditional regression, logistic regression, MANCOVA, factor analysis, cluster analysis, survival analysis, time series analysis, canonical correlations, and discriminant analysis, as well as more advanced techniques such as structural equation modeling (SEM), hierarchical linear modeling (HLM), and path analysis.
AMSTAT Consulting’s Other Services
- establishing and operationalizing your hypotheses and research questions
- providing ample instruction on the methods used
- inputting, organizing, and cleaning the data
- implementing the statistical analyses
- testing reliability (such as Cronbach’s alpha, test-retest reliability, split-half reliability, and inter-rater reliability) and validity (such as content validity, construct validity, criterion validity, internal validity, and external validity)
- writing up all results, including APA tables and figures
- providing syntax and raw output files
- explaining the results
- allowing unlimited e-mail and phone support to ensure that you completely understand the results of the analysis
- conducting two rounds of incidental statistics (additional analyses you may request)
- performing qualitative analysis
- preparing an effective PowerPoint Presentation
- providing support until the project is complete.
AMSTAT Consulting is dedicated to detecting and correcting corrupt or inaccurate records in a record set, table, or database. The process of data cleaning includes data auditing, workflow specification, workflow execution, post-processing, and control.
We use popular methods, including parsing, data transformation, duplicate elimination, and statistical screening. By analyzing the mean, standard deviation, and range of the data, and by applying clustering algorithms, we can identify values that are unexpected and thus potentially erroneous.
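As a minimal illustration of the mean/standard-deviation screen described above, the sketch below flags values whose z-score exceeds a cutoff. The function name, the cutoff of 2, and the data are all invented for this example; in small samples a lone extreme value inflates the standard deviation, so a cutoff of 3 can miss it.

```python
import numpy as np

def flag_outliers_zscore(values, threshold=2.0):
    """Flag values more than `threshold` sample standard deviations
    from the mean. Illustrative only; `threshold=2.0` is an assumed,
    commonly used cutoff, not a universal rule."""
    values = np.asarray(values, dtype=float)
    z = (values - values.mean()) / values.std(ddof=1)
    return np.abs(z) > threshold

# Invented example data: one clearly aberrant measurement (42.0).
data = [10.1, 9.8, 10.3, 9.9, 10.0, 42.0, 10.2]
print(flag_outliers_zscore(data))
```

Clustering-based screens (e.g., flagging points far from every cluster centroid) follow the same idea with a multivariate notion of "unexpected."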
We can examine any standardized residual greater than about 3 in absolute value, any hat (leverage) value greater than 3p/n (where p = k + 1 and k is the number of predictors), any Cook’s distance greater than 1, and the Mahalanobis distance for each case. We also screen for outliers graphically with run-sequence plots, scatter plots, histograms, and box plots.
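The regression diagnostics just listed can be sketched in a few lines of NumPy. The function name is an invented illustration, and the thresholds (|r| > 3, h > 3p/n, D > 1) are the rules of thumb mentioned above, not outputs of the code:

```python
import numpy as np

def influence_diagnostics(X, y):
    """Standardized (internally studentized) residuals, hat/leverage
    values, Cook's distances, and Mahalanobis distances for an OLS fit.

    X: (n, k) predictor matrix without an intercept column; y: (n,).
    A sketch for illustration, not a production implementation.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n, k = X.shape
    Xd = np.column_stack([np.ones(n), X])        # add intercept: p = k + 1
    p = k + 1

    H = Xd @ np.linalg.inv(Xd.T @ Xd) @ Xd.T     # hat matrix
    h = np.diag(H)                               # leverage values
    resid = y - H @ y                            # OLS residuals
    mse = resid @ resid / (n - p)
    std_resid = resid / np.sqrt(mse * (1 - h))   # standardized residuals
    cooks = std_resid**2 * h / (p * (1 - h))     # Cook's distance

    # Mahalanobis distance of each case from the predictor centroid
    mu = X.mean(axis=0)
    S_inv = np.linalg.inv(np.atleast_2d(np.cov(X, rowvar=False)))
    diff = X - mu
    mahal = np.sqrt(np.einsum("ij,jk,ik->i", diff, S_inv, diff))

    return std_resid, h, cooks, mahal
```

Cases would then be flagged by comparing each statistic to its cutoff, e.g. `np.abs(std_resid) > 3` or `h > 3 * p / n`.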
We can test reliability (such as Cronbach’s alpha, test-retest reliability, split-half reliability, and interrater reliability) and validity (such as content validity, construct validity, criterion validity, internal validity, and external validity).
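As one concrete instance of the reliability checks above, Cronbach's alpha can be computed directly from its definition: alpha = k/(k-1) × (1 − Σ item variances / variance of the total score). The function name and the score matrix below are invented for illustration:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix.

    Implements alpha = k/(k-1) * (1 - sum(item variances) / var(total)).
    A minimal sketch of the internal-consistency check described above.
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total score
    return k / (k - 1) * (1 - item_var / total_var)

# Invented data: each row is one respondent's answers to 3 scale items.
scores = np.array([
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 2, 3],
    [4, 4, 4],
])
print(round(cronbach_alpha(scores), 3))
```

Values near 1 indicate items that move together; a common (though debated) rule of thumb treats alpha above about 0.7 as acceptable internal consistency.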
We can design and perform the required statistical analyses. Here is a sample of some of the analytical tools with which we are familiar:
- Traditional Regression
- Logistic Regression
- Factor Analysis
- Cluster Analysis
- Time Series Analysis
- Survival Analysis
- Canonical Correlations
- Discriminant Analysis
- Structural Equation Modeling (SEM)
- Hierarchical Linear Modeling (HLM)
- Path Analysis
We have expertise with virtually every statistical and qualitative software package.
We are dedicated to helping our clients analyze their qualitative data utilizing a number of different methodologies.
Meta-Analysis is a methodology employed to synthesize the outcomes of various studies related to the same topic or outcome measure. We can achieve an integration of research findings through qualitative means. Qualitative meta-analysis, also referred to as meta-synthesis, follows the same replicable procedures of a quantitative meta-analysis; however, it is interpretive rather than aggregative. It is critical to define the domain of research and establish criteria for including studies in the review.
Phenomenology has its roots in philosophy as presented by Edmund Husserl in the late 1800s. Scholars applied phenomenology to the field of psychology around 1879 (Giorgi, 2009). A phenomenon is any experience that presents itself to consciousness, something that is present in human consciousness. There are two main principles in the phenomenological approach:
1. Husserl’s “principle of principles,” which views experiences as legitimate; researchers should not try to add to or subtract from what is presented by participants (Giorgi, 2009, p. 69).
2. The concept of “free imaginative variation,” which requires removing an aspect of the phenomenon to determine what is essential. If its removal substantially changes the phenomenon, that aspect is essential (Giorgi, 2009, p. 69).
A case study approach is useful for exploring, describing, and explaining research questions (Yin, 1994). In addition, a case study approach is useful for answering “how” or “what” questions (Yin, 2009). A multiple case study investigates a contemporary phenomenon within its real-life context (Yin, 2009, p. 37).
I have worked closely with AMSTAT Consulting on the data analysis and results of two research projects, so I feel knowledgeable about their expertise. On all accounts the company provided me with reliable statistical analysis and results that I could translate into publishable format. They are conscientious experts who provide keen insights into appropriate statistical analysis given various data sets. I highly recommend them for your statistical support needs!