Research Instruments

Over the years, I have designed, developed, and validated several research instruments covering a wide range of constructs. A few of these instruments are provided here for research purposes. All instruments are freely available for research use, provided proper attribution is given in any published work. I also provide citations to the articles that use these instruments. Please contact me if you have any questions or concerns.

Electronic Portfolio Student Perspective Instrument (EPSPI)

The Electronic Portfolio Student Perspective Instrument (EPSPI) is designed to measure student perspectives on ePortfolios across five factors: visibility, employment, assessment, learning, and support. The EPSPI is based on a conceptual framework, visualized in the figure below. The original EPSPI contains 43 unique items presented on a modified Likert scale (Strongly Disagree; Disagree; Neither Agree nor Disagree; Agree; and Strongly Agree). After the exploratory factor analysis, the visibility factor absorbed the employment items, and all negatively stated items were removed from the scale. The current form of the EPSPI is the result of a series of studies covering the design, development, and ultimately the validation of the survey tool.
[Figure: ePortfolio conceptual framework]
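For researchers scoring EPSPI data, a minimal sketch of coding the modified Likert responses and averaging items by post-EFA factor might look like the following. The item IDs and item-to-factor assignments here are hypothetical placeholders; the actual assignments are in the downloadable instrument and the validation articles below.

```python
# Minimal sketch of scoring EPSPI Likert responses. Item IDs and the
# item-to-factor mapping are HYPOTHETICAL; consult the instrument itself.

LIKERT = {
    "Strongly Disagree": 1,
    "Disagree": 2,
    "Neither Agree nor Disagree": 3,
    "Agree": 4,
    "Strongly Agree": 5,
}

# Hypothetical mapping of item IDs to the four post-EFA factors
# (visibility absorbed the original employment items).
FACTORS = {
    "visibility": ["Q1", "Q2", "Q3"],
    "assessment": ["Q4", "Q5"],
    "learning": ["Q6", "Q7"],
    "support": ["Q8", "Q9"],
}

def score_respondent(responses: dict[str, str]) -> dict[str, float]:
    """Return the mean Likert score per factor for one respondent."""
    scores = {}
    for factor, items in FACTORS.items():
        values = [LIKERT[responses[i]] for i in items if i in responses]
        scores[factor] = sum(values) / len(values) if values else float("nan")
    return scores

# Example respondent with partial responses
print(score_respondent({"Q1": "Agree", "Q2": "Strongly Agree", "Q4": "Disagree"}))
```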

References

Ritzhaupt, A. D., Ndoye, A., & Parker, M. A. (2010). Validation of the Electronic Portfolio Student Perspective Instrument (EPSPI): Conditions under a different integration initiative. Journal of Computing in Teacher Education, 26(3), 111–119.

Ritzhaupt, A. D., Singh, O., Seyferth, T., & Dedrick, R. F. (2008). Development of the Electronic Portfolio Student Perspective Instrument: An ePortfolio integration initiative. Journal of Computing in Higher Education, 19(2), 47–71.

Ritzhaupt, A. D., & Singh, O. (2006). Student perspectives of ePortfolios in computing education. In R. Menezes (Ed.), Proceedings of the Association of Computing Machinery Southeast Conference (pp. 152–157). Melbourne, FL: ACM.

Ritzhaupt, A. D., & Singh, O. (2006). Student perspectives of organizational uses of ePortfolios in higher education. In E. Pearson & P. Bohman (Eds.), Proceedings of ED-Media: World Conference on Educational Multimedia, Hypermedia, and Telecommunications (pp. 1717–1722). Orlando, FL: AACE.

Download Instrument: EPSPI in MS Word Format

e-Learner Satisfaction Scale (eLSS)

The eLSS is designed to be a parsimonious, flexible, objective, valid, and reliable tool for reporting learner satisfaction with self-paced e-learning environments in either research or practice. The tool uses a 9-item semantic differential format to collect data from learners. The figure below visualizes the conceptual framework for the eLSS, showing the nine items along with the positive and negative bipolar adjectives used to gauge a learner’s satisfaction with a wide range of self-paced e-learning solutions. The nine items are specific enough to capture the learner’s perception of the e-learning experience, yet general enough to apply to virtually any type of self-paced e-learning situation (e.g., eBooks, online instructional videos, etc.). The adjectives were selected to correlate positively with one another and to operationalize a single unidimensional construct of learner satisfaction with self-paced e-learning environments.

[Figure: eLSS conceptual framework]
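Because the eLSS is intended to operationalize a single construct, a common analysis is to average the nine items into one satisfaction score per learner and to check internal consistency. Below is a minimal sketch using Cronbach’s alpha; the 7-point coding and the response matrix are assumptions for illustration, not taken from the instrument or any published results.

```python
# Minimal sketch: scoring the 9-item semantic differential and estimating
# internal consistency (Cronbach's alpha) for the unidimensional eLSS.
# The 7-point coding and the response matrix are FABRICATED for illustration.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of numeric ratings."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical responses: 4 learners x 9 items, each coded 1-7
# (1 = negative adjective pole, 7 = positive adjective pole).
ratings = np.array([
    [6, 7, 6, 5, 6, 7, 6, 6, 7],
    [4, 5, 4, 4, 5, 4, 5, 4, 4],
    [7, 7, 6, 7, 6, 7, 7, 6, 7],
    [3, 4, 3, 4, 3, 3, 4, 3, 4],
])

satisfaction = ratings.mean(axis=1)  # one satisfaction score per learner
print("scores:", satisfaction)
print("alpha: %.2f" % cronbach_alpha(ratings))
```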

References

Ritzhaupt, A. D. (In review). Measuring learner satisfaction in self-paced e-learning environments: Validation of the e-Learner Satisfaction Scale (eLSS). International Journal of E-Learning.

Download Instrument: eLSS in MS Word Format

Ideal Computing Professional Association Survey (ICPAS)

The Ideal Computing Professional Association Survey (ICPAS) was designed, developed, and initially validated to provide preliminary validity and reliability evidence with professionals from a computing professional association. The research was driven by the recognized need for a system to effectively measure the factors that influence professional association membership. The ICPAS was developed from the conceptual framework shown in the figure below, a focus group of computing professionals, and a review panel of experts, and was subsequently deployed within the Association of Information Technology Professionals (AITP). The ICPAS provides a reliable measurement system that computing professional association leadership can use to make informed decisions and to benchmark professional needs and motivations. The final instrument has 52 items and uses the same modified Likert scale (Strongly Disagree; Disagree; Neither Agree nor Disagree; Agree; Strongly Agree).

[Figure: ICPAS conceptual framework]
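As a rough illustration of the benchmarking the ICPAS supports, a sketch like the one below aggregates members’ coded Likert ratings into per-item means and standard deviations. The item labels and data are hypothetical; the real instrument has 52 items on the 5-point scale described above.

```python
# Minimal sketch of benchmarking aggregate ICPAS item ratings across a
# membership. Item labels and responses are HYPOTHETICAL.
import statistics

# Each inner list is one member's coded ratings
# (1 = Strongly Disagree ... 5 = Strongly Agree) for three example items.
responses = [
    [4, 5, 2],
    [5, 4, 3],
    [3, 4, 2],
]

item_labels = ["networking", "career_services", "publications"]  # hypothetical
for idx, label in enumerate(item_labels):
    ratings = [r[idx] for r in responses]
    print(f"{label}: mean={statistics.mean(ratings):.2f}, "
          f"sd={statistics.stdev(ratings):.2f}")
```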

References

Ritzhaupt, A. D., Umapathy, K., & Jamba, L. (2012). A study on services motivating computing professional association membership. International Journal of Human Capital and Information Technology Professionals, 3(1), 54–70.

Ritzhaupt, A. D., Umapathy, K., & Jamba, L. (2008). Computing professional association membership: An exploration of membership needs and motivations. Journal of Information Systems Applied Research, 1(4), 1–23.

Download Instrument: ICPAS in MS Word Format

Educational Technologist Competency Survey (ETCS)

The Educational Technologist Competency Survey (ETCS) was designed, developed, and validated based on a conceptual framework that emphasizes the definition of educational technology and its associated knowledge, skill, and ability (KSA) statements. Using the conceptual framework shown in the figure below, the ETCS was developed by conducting: 1) a review of the extant literature on the competencies of educational technology professionals; 2) a job announcement analysis of 400 postings from five relevant databases; 3) extraction and merging of the KSA statements from the job announcements and the literature; and 4) administration of the survey to a wide variety of educational technology professionals. This process yielded 176 competencies organized into knowledge, skill, and ability statements. The instrument uses the following response scale for the KSA statements: Not important at all (1); Important to a small extent (2); to some extent (3); to a moderate extent (4); and to a great extent (5). This response scale was adopted to gauge the relative importance of a competency from an educational technology professional’s perspective. The instructions for participants read: “Please indicate the importance of the following (knowledge/skill/ability) statements in creating, using, and managing learning resources and processes.”

[Figure: ETCS conceptual framework]
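Given the 1–5 importance scale, a typical use of ETCS data is to rank KSA statements by mean rated importance. A minimal sketch follows; the statements and ratings are hypothetical, not drawn from the published studies.

```python
# Minimal sketch of ranking KSA statements by mean rated importance on the
# ETCS's 1-5 response scale. Statements and ratings are HYPOTHETICAL.

IMPORTANCE = {
    "Not important at all": 1,
    "Important to a small extent": 2,
    "Important to some extent": 3,
    "Important to a moderate extent": 4,
    "Important to a great extent": 5,
}

# Hypothetical data: KSA statement -> list of coded responses (1-5).
ratings = {
    "Knowledge of instructional design models": [5, 4, 5, 4],
    "Skill in video editing software": [3, 4, 2, 3],
    "Ability to communicate with stakeholders": [5, 5, 4, 5],
}

# Sort statements by mean importance, highest first.
ranked = sorted(
    ratings.items(),
    key=lambda kv: sum(kv[1]) / len(kv[1]),
    reverse=True,
)
for statement, values in ranked:
    print(f"{sum(values) / len(values):.2f}  {statement}")
```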

References

Ritzhaupt, A. D., Martin, F., Pastore, R., & Kang, Y. (In press). Development and validation of the Educational Technologist Competencies Survey (ETCS): Knowledge, skills, and abilities. Journal of Computing in Higher Education.

Kang, Y., & Ritzhaupt, A. D. (2015). A job announcement analysis of educational technology professional positions: Knowledge, skills, and abilities. Journal of Educational Technology Systems, 43(3), 231–256.

Ritzhaupt, A. D., & Martin, F. (2014). Development and validation of the educational technologist multimedia competency survey. Educational Technology Research and Development, 62(1), 13–33.

Ritzhaupt, A. D., Martin, F., & Daniels, K. (2010). Multimedia competencies for an educational technologist: A survey of professionals and job announcement analysis. Journal of Educational Multimedia and Hypermedia, 19(4), 421–449.

Download Instrument: ETCS in MS Word Format