TUPS Article Published

Ritzhaupt, A. D., Huggins-Manley, A. C., Dawson, K., Ağaçlı-Doğan, N., & Doğan, S. (2017). Validity and Appropriate Uses of the Revised Technology Uses and Perceptions Survey (TUPS). Journal of Research on Technology in Education, 49(1–2), 73–87. Retrieved from http://dx.doi.org/10.1080/15391523.2017.1289132

Abstract: The purpose of this article is to explore validity evidence and appropriate uses of the revised Technology Uses and Perceptions Survey (TUPS) designed to measure in-service teacher perspectives about technology integration in K–12 schools and classrooms. The revised TUPS measures 10 domains, including Access and Support; Preparation of Technology Use; Perceptions of Professional Development; Perceptions of Technology Use; Confidence and Comfort Using Technology; Technology Integration; Teacher Use of Technology; Student Use of Technology; Perceived Technology Skills; and Technology Usefulness. We first provide a review of the literature supporting the design of the revised TUPS. We collected data from N = 1,376 teachers from one medium-sized school district in the state of Florida and conducted a variety of psychometric analyses. We performed internal structure analysis, correlation analysis, and factor analysis with these data. The results demonstrate that data collected from the TUPS are best used as descriptive, granular information about reported behaviors and perceptions related to technology, rather than treated as a series of 10 scales. These findings have implications for the appropriate uses of the TUPS. (Keywords: technology integration, K-12 teachers, survey, validity, reliability)
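For readers curious about the kind of internal structure and correlation analyses mentioned in the abstract, the sketch below shows one way such an analysis could be approximated in Python with pandas. The file name, domain prefixes, and column layout are hypothetical placeholders for illustration only, not the TUPS data or the authors' actual analysis pipeline.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal-consistency estimate for one set of survey items."""
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    k = items.shape[1]
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical layout: one column per survey item, prefixed by a domain code.
responses = pd.read_csv("tups_responses.csv")
domains = {
    "access_support": [c for c in responses.columns if c.startswith("AS_")],
    "confidence_comfort": [c for c in responses.columns if c.startswith("CC_")],
}

# Reliability estimate per domain
for name, cols in domains.items():
    print(name, round(cronbach_alpha(responses[cols]), 3))

# Correlations among domain mean scores
domain_means = pd.DataFrame({n: responses[c].mean(axis=1) for n, c in domains.items()})
print(domain_means.corr())
```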

Nursing Simulations Article Published

Kleinheksel, A. J., & Ritzhaupt, A. D. (2017). Measuring the adoption and integration of virtual patient simulations in nursing education: An exploratory factor analysis. Computers & Education, 108, 11–29.

Abstract: This study sought to develop a valid and reliable instrument to identify the characteristics of computer-based, interactive, and asynchronous virtual patient simulations that nurse educators identify as important for adoption, and the subsequent curricular integration strategies they employed. Once these factors were identified, this study also sought to explore any relationships between the influential features for adoption and the ways in which the adopted virtual patients are integrated. Data were collected with the Virtual Patient Adoption and Integration in Nursing (VPAIN) survey, which was completed by 178 nurse educators who were currently using, or had previously used virtual patient simulations. Both exploratory factor analysis and correlation analysis were conducted. Through exploratory factor analysis, 55.6% of the variance in the VPAIN adoption subscale data was accounted for by the nine adoption factors identified: Trustworthiness, Worldbuilding, Pedagogy, Differentiation, Encouragement, Clarity, Evaluation, Administrative Pressure, and Visibility. The factor analysis also identified five factors within the integration subscale, which accounted for 53.3% of the variance: Hour Replacement, Intensive Integration, Leveling, Preparation, and Benchmarking. A correlation analysis was conducted to identify relationships between the adoption and integration factors.
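As a rough illustration of the exploratory factor analysis reported in the abstract, the following Python sketch uses the factor_analyzer package to extract a nine-factor solution from item-level data. The file name, item columns, rotation, and extraction method are assumptions chosen for the example, not the authors' actual procedure.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical item-level data for an adoption subscale
adoption_items = pd.read_csv("vpain_adoption_items.csv")

# Nine-factor solution with an oblique rotation, mirroring the factor count
# reported in the abstract (rotation and extraction method are illustrative).
efa = FactorAnalyzer(n_factors=9, rotation="promax", method="minres")
efa.fit(adoption_items)

# Pattern loadings, one row per item and one column per factor
loadings = pd.DataFrame(efa.loadings_, index=adoption_items.columns)
print(loadings.round(2))

# Variance explained per factor and cumulatively across factors
variance, proportional, cumulative = efa.get_factor_variance()
print("cumulative variance explained:", cumulative.round(3))
```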

Games and Simulations Article Published

Ritzhaupt, A. D., Poling, N., Frey, C., Kang, Y., & Johnson, M. (2016). A phenomenological study of games, simulations, and virtual environments courses: What are we teaching and how? International Journal of Gaming and Computer-Mediated Simulations, 8(3), 59–73.

Abstract: Educational technology programs from across the United States are offering graduate courses in games, simulations, and virtual environments (GSVE) to their students. However, these courses, until now, have not been systematically studied. This research uses a hermeneutical phenomenological approach to answer the research question: “How do instructors describe their experience teaching GSVE courses?” Five professors of educational technology who have taught GSVE courses were interviewed using a semi-structured protocol based on the TPACK (Technological Pedagogical Content Knowledge) framework. These data were analyzed both analytically and thematically. The results of the study showed that a wide variety of topics, tools, and pedagogies are used within GSVE courses. Five themes emerged from the results: Focus on Application and Theory, Experiential Learning and Constructivism, Instructor’s Prior Experience with Games, Heterogeneous Student Populations, and Range of Technology Tools. These themes, as well as the courses themselves, are highlighted within this paper. A discussion is provided.

TPACK Article Published

Ritzhaupt, A. D., Huggins, A. C., Madley, S., Ruggles, K., & Wilson, M. (2016). Validation of the Survey of Preservice Teachers’ Knowledge of Teaching and Technology: A multi-institutional sample. Journal of Digital Learning in Teacher Education, 32(1), 26–37.

Abstract: The TPACK (technological pedagogical content knowledge) framework (Mishra & Koehler, 2006) has gained tremendous momentum from within the educational technology community. Specifically, much discourse has focused on how to measure this multidimensional construct to further define the contours of the framework and potentially make some meaningful predictions. Some have proposed observation scales while others have proposed self-report measures to gauge the phenomenon. The Survey of Preservice Teachers’ Knowledge of Teaching and Technology instrument is one popular tool designed to measure TPACK (Schmidt et al., 2009), specifically from preservice teachers in teacher education programs. This study extends the measurement framework by providing a confirmatory factor analysis of the theoretical model proposed by Schmidt et al. (2009) on a sample of 227 preservice teachers from four public institutions of higher education in the southeastern United States. The data did not fit the theoretical 10-factor model implied by Schmidt et al. (2009); thus, an exploratory factor analysis was conducted to determine the optimal structure of the measurement tool for these data. This resulted in a nine-factor model, and there were measurement issues for several of the constructs. Additionally, the article provides evidence of external validity by correlating the instrument scores with other known technology constructs.
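For orientation, a confirmatory factor analysis like the one described can be sketched in Python with the semopy package, as below. The measurement model shown is a shortened, hypothetical fragment with made-up item names; the full 10-factor model and the actual items come from Schmidt et al. (2009), and the authors' own software and model specification may differ.

```python
import pandas as pd
import semopy

# Hypothetical fragment of a TPACK measurement model (made-up item names).
model_desc = """
TK  =~ tk1 + tk2 + tk3 + tk4
PK  =~ pk1 + pk2 + pk3 + pk4
CK  =~ ck1 + ck2 + ck3
TPK =~ tpk1 + tpk2 + tpk3
"""

# Item-level responses, one column per survey item (placeholder file name)
data = pd.read_csv("tpack_survey.csv")

model = semopy.Model(model_desc)
model.fit(data)

# Global fit indices (chi-square, CFI, RMSEA, etc.) to judge whether the
# hypothesized factor structure is consistent with the sample data.
print(semopy.calc_stats(model).T)

# Parameter estimates: loadings, factor covariances, and residual variances
print(model.inspect())
```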

Instructional Designer Article Published

Ritzhaupt, A. D., & Kumar, S. (2015). Knowledge and skills needed by instructional designers in higher education. Performance Improvement Quarterly, 28(3), 51–69.

Abstract: In this paper, we sought to address the following research question: What knowledge and skills are needed by instructional designers in higher education to be successful in their roles? We interviewed eight instructional designers from across the United States, all working for institutions of higher education. Using the constant comparative method, we analyzed our data to identify relevant themes. Our results suggest that instructional designers in higher education must have a solid foundation in instructional design and learning theory, possess soft skills and technical skills, and have a willingness to learn on the job. Most instructional designers felt their academic backgrounds assisted them with their job roles, and, in particular, valued their professional experiences. Instructional designers in higher education must also keep abreast of multiple emerging information and communication technologies. We provide a discussion to synthesize our findings. The findings are relevant to professionals, professional academic programs, and professional associations.
