tigerleo

August 29, 2008

Assignment 1, Research Methods – “Learning Styles” Annotated Bibliography

Filed under: CSG5140 Research Methods, ECU MInT — steve @ 7:09 pm

Abstract

This paper has been presented in the form of an annotated bibliography reporting the outcomes of investigation and research into the area of "learning styles in online education" for Assignment 1, CSG5140 Research Methods, Edith Cowan University. The paper has been compiled in three parts: [A] an evaluative annotated bibliography of five refereed journal articles pertaining to the use of learning styles in online education; [B] further evaluation of the methods employed in one of those refereed articles; and [C] five additional articles relevant to the investigation undertaken in part B, which have been added to the annotated bibliography described in part A. The five additional articles concern four differing learning style inventories plus one research methods knowledge base.

Parts A & C:

CAPT. (2008). CAPT: Books, Research for MBTI, Archetypes, Leadership, Psychological Type. Center for Applications of Psychological Type. Retrieved August 23, 2008, from http://www.capt.org/

The Center for Applications of Psychological Type (CAPT), located in Gainesville, Florida, USA, was established by Isabel Briggs Myers and Dr. Mary McCaulley in 1975 as an education centre for training, consultation, publication and research concerning the Myers-Briggs Type Indicator (MBTI). The CAPT website serves as a portal for the general public to access information on CAPT's products and services, assessment tools and tests, and the MBTI Instrument. Full contact details for CAPT were provided on the website.

The CAPT website provided a detailed overview of the MBTI Instrument, including the history of its development by Isabel Briggs Myers, which was based upon Carl Jung's work on archetypes and his concept of the psychological type. The website included descriptions of: the 16 Myers-Briggs personality types; type dynamics; the reliability and validity of the MBTI Instrument; comparisons to other psychological tests; a code of ethics; frequently asked questions; and a list of related organisations. Additional links explained the application of personality types to: the workplace; mind, body and spirit (effectively, stress); children and families; and education. A link was provided to take the MBTI online; however, this was not a free service, as one needed to book a session in order to receive the MBTI test and one hour of feedback from a qualified Myers-Briggs professional.

Navigational issues aside, the CAPT website provided a useful background to the 16 Myers-Briggs personality types and their application to the understanding of oneself. The section on using personality types in education defined four learning centres or quadrants which would prove useful if one were to further explore the use of the MBTI Instrument as a basis for identifying students' online learning styles and needs.

 

ETL project. (n.d.). Introduction. ETL project. Retrieved August 23, 2008, from http://www.etl.tla.ed.ac.uk/index.html

The Enhancing Teaching-Learning Environments in Undergraduate Courses (ETL) project website detailed the research by, and contact details of: Professor Dai Hounsell, Professor Noel Entwistle, Dr Charles Anderson, Dr Kate Day and Dr Velda McCune of the University of Edinburgh, Edinburgh, UK; Dr Adrian Bromage, Coventry University, Coventry, UK; Professor Ray Land, University of Strathclyde, Glasgow, UK; Professor Erik Meyer, University of Durham, Durham, UK; and Dr Nicola Reimann, Northumbria University, Newcastle, UK; as well as additional advisers and consultants, from 2001 to 2005. The purpose of the research was to provide educational departments involved in undergraduate teaching with "new ways of encouraging high quality learning".

The ETL project website provided links to: an overview of the ETL project's purpose; concepts and research strategy; information on and contact details of the research team; a list of downloadable project reports and publications; and links to the Teaching and Learning Research Programme and The Economic and Social Research Council websites, which were presumed to be the initiators and funders of the project respectively. Downloadable ETL measurement instruments included the: Shortened Experiences of Teaching and Learning Questionnaire; Learning and Studying Questionnaire; Experiences of Teaching and Learning Questionnaire; and Approaches and Study Skills Inventory for Students (ASSIST). Also provided was a bibliography of studies that had either used or were conceptually related to the Approaches to Studying Inventory, the Revised Approaches to Studying Inventory and ASSIST.

Although one's own research into learning styles will target primary school students while the ETL questionnaires were aimed at undergraduates, the resources offered on the ETL website appeared worthy of further exploration into learning approaches.

 

Felder, R. M. (2008). Richard Felder: Resources in Science and Engineering Education. Richard Felder’s Home Page. Retrieved August 23, 2008, from http://www4.ncsu.edu/unity/lockers/users/f/felder/public/RMF.html

Dr. Richard M. Felder's website, hosted on North Carolina State University's domain, aimed to provide educators with the guidance, techniques and knowledge to make teaching more effective. The website stated that Dr. Felder's title is "Hoechst Celanese Professor Emeritus of Chemical Engineering at North Carolina State University", which was readily validated by searching for that exact title in Google.

Felder's homepage provided links to numerous articles, papers, publications and studies that Felder has produced or co-authored over a period of approximately 35 years or more. Included were links to web pages listing his works under Education, Learning Styles, Teaching/Learning, Random Thoughts, Workshops, Handouts, Tutorials and Related Papers. Of particular relevance to this bibliography entry, Felder provided resources for the use of the Felder-Silverman Learning Style model in the Felder-Soloman Index of Learning Styles online instrument, which may be used free of charge for non-commercial purposes. Felder also provided downloadable documents of his publications concerning other learning style models such as Learning Approaches, Myers-Briggs, Kolb and the Herrmann Brain Dominance Instrument.

The sheer volume of Felder's work presented on his website was beyond the scope of this bibliography entry to comment on in full detail. One feels that, given one's own need to explore learning styles for future research endeavours, Felder's site would provide an invaluable source of information not only on the Felder-Soloman Index of Learning Styles instrument but also for further exploration of existing studies into the use of other learning style models. Certainly worthy of further exploration as time allows.

 

Kolb, A., & Kolb, D. A. (2008). Experience Based Learning Systems, Inc. – Devoted to the advancement of experiential learning. Experience Based Learning Systems, Inc. Retrieved August 23, 2008, from http://www.learningfromexperience.com

Experience Based Learning Systems, Inc. "is a research and development company devoted to advancement of the theory, research and practice of experiential learning". The company was founded by David A. Kolb, Professor of Organizational Behavior at the Weatherhead School of Management, Case Western Reserve University, Cleveland, Ohio, USA, while the company's president is Alice Kolb, Adjunct Professor of Organizational Behavior at the same school. Curricula vitae detailing credentials and contact information for both parties were provided on the website.

The Experience Based Learning Systems, Inc. (EBLS) website provided information concerning: the website's authors; Team Learning workshops; Assessment Tools; Research Publications; the EBLS Network; Frequently Asked Questions; and a web-based contact form for inquiries. Of interest were the sections on the Kolb Learning Style Inventory (LSI) Version 3.1 and the Kolb Adaptive Style Inventory (ASI). These tools are not free and are offered at varying rates for single or multiple questionnaires. Technical specifications of the Kolb LSI, Version 3.1, 2005 were provided as a downloadable document. The Frequently Asked Questions page provided additional details about learning styles and Kolb's LSI.

The Kolb LSI (KLSI), version 3.1, technical specifications looked to be of interest with regard to the stated validity of the KLSI in a number of educational specialisations. The comparisons made to other learning style inventories, however, appeared to concern other models designed or co-designed by Kolb. It was evident that the KLSI is a commercial tool and hence not as freely accessible as the Felder-Soloman Index of Learning Styles instrument.

 

Mestre, L. (2006). Accommodating Diverse Learning Styles in an Online Environment. Reference & User Services Quarterly, 46(2), 27-32.  Retrieved August 13, 2008, from Academic Research Library database. (Document ID: 1192839391).

A qualitative research paper aimed at librarians and instructors by Associate Professor Lori Mestre, Library Administration, University of Illinois, Urbana-Champaign, USA.

Mestre provided a concise outline of existing theories of learning styles and then contrasted the differing learning styles of diverse groups with reference to field dependence versus field independence learning theory. Kolb's four-stage learning cycle was discussed, along with Kolb's identification of four differing learning styles. Mestre then provided an overview of Honey and Mumford's work in modifying Kolb's learning styles and how those styles might respond to an online learning environment. Mestre also described two emerging types of learners, Global Learners and Millennials, whose presence indicated a need for instructional change. Finally, Mestre provided suggestions for enhancing teaching in the online environment.

Mestre's contrast of field dependence with field independence learning theory, though referring specifically to North American students, served to highlight the importance of identifying students' learning styles and needs. The section on Kolb was extremely useful in identifying Kolb as a major point of reference in the investigation of learning styles. The identification of Global Learners and Millennials would seem to validate what educators such as Prensky have described as the new challenge facing educators. However, the suggestions provided by Mestre to enhance teaching for the online environment were superficial at best and did not provide any new and meaningful solutions for managing courseware to accommodate students with diverse needs.

Overall, a very informative paper in regard to the background of learning style theory and associated applications to the online teaching environment.

 

Moallem, M. (2008). Accommodating Individual Differences in the Design of Online Learning Environments: A Comparative Study. Journal of Research on Technology in Education, 40(2), 217-245.  Retrieved August 13, 2008, from Academic Research Library database. (Document ID: 1447280671).

A qualitative and quantitative research paper aimed at instructional designers, online and distance educators, by Professor Mahnaz Moallem, Instructional Technology and Research, University of North Carolina, Wilmington, Watson School of Education, USA.

Moallem detailed her study of 14 students who participated in an online learning environment that she devised. Moallem described the design of the online environment, which included "three main characteristics" of instruction: self-awareness, meta-cognition and problem solving, along with Kolb's experiential learning theory as a framework. Moallem then outlined five quantitative instruments that were used to chart the students' progress and outcomes in the course. Moallem concluded that, while limitations in the study were evident, such as the limited sample size and the number of evaluated modules in the single case study, the inclusion of learning styles in the design of online learning materials was possible.

The reasoning provided by Moallem in the design of the online environment gave a unique insight into the author's development process. The tables provided by Moallem were valuable guidelines that broke down Kolb's learning styles into achievable development solutions. The five quantitative instruments Moallem used provided useful ways and means of capturing a range of data to evaluate an online learning environment. Moallem's finding that student satisfaction levels were retained in her online environment was also worthy of note.

A thorough and informative paper including Moallem’s very insightful observation:
if an online learning environment provides social interaction, collaboration and problem solving then students will adapt their preferred learning style to achieve a positive outcome.

 

Mupinga, D. M., Nora, R. T., & Yaw, D. C. (2006). THE LEARNING STYLES, EXPECTATIONS, AND NEEDS OF ONLINE STUDENTS. College Teaching, 54(1), 185-189.  Retrieved August 13, 2008, from ProQuest Education Journals database. (Document ID: 1004646511).

A qualitative and quantitative research paper aimed at online educators, by Professor Davison M. Mupinga and Assistant Professor Dorothy Carole Yaw, School of Technology, Indiana State University, Terre Haute, USA, and Robert T. Nora, Chair, Baccalaureate Program, Vincennes University, Vincennes, Indiana, USA.

Mupinga et al. provided a brief yet informative account of students' learning styles and individual needs, which were then compared to an online learning environment, leading to the observation that online students' learning styles are unknown. Mupinga et al.'s qualitative research found that those who take online courses often do so out of convenience as opposed to that mode meeting their particular learning needs. Consequently, Mupinga et al. undertook a quantitative study of 131 undergraduate students to determine individual learning styles, expectations and needs, using an online Myers-Briggs personality type indicator and a single, double-barrelled, open-ended survey question. Their research outcomes were stated as three generalised approaches toward accommodating individual online learning needs.

Mupinga et al. appear to falter with the quantitative data collected through the Myers-Briggs online personality survey and through the student responses to the double-barrelled open-ended question. The responses to the open-ended question "What are your needs and expectations as an internet student?", though interesting, were not correlated against the 16 Myers-Briggs personality types. As a result, Mupinga et al. were only able to conclude that online courses needed to cater for multiple learning styles. Their conclusion, though supporting their initial qualitative research, provided little additional or useful information.

 

Speth, C. A., Lee, D. J., & Hain, P. M. (2006). Prioritizing Improvements in Internet Instruction Based on Learning Styles and Strategies. Journal of Natural Resources and Life Sciences Education, 35, 34-41. Retrieved August 13, 2008, from ProQuest Education Journals database. (Document ID: 1222181221).

A quantitative research paper aimed at identifying improvements in an existing online course by Carol Speth, educational psychologist, Donald Lee, lecturer, genetics and crop engineering, and Patricia Hain, internet course developer, University of Nebraska, USA.

Speth et al.'s paper outlined a three-step process for the evaluation of learners' styles: [1] identification of a theoretical framework and related questions; [2] categorisation of the students; and [3] evaluation of the lessons based upon those categories. Using learning approaches (motivation, intention and study method), as opposed to learning styles, a quantitative analysis of 300 students studying 20 online modules over five semesters was undertaken. Data were collected through: [1] an evaluation form, based upon the Approaches to Studying Inventory, covering six key elements of the lessons: objectives, text, animations, glossary, images and quizzes; and [2] the results of an Approaches and Study Skills Inventory for Students survey that identified students as having either a deep, strategic or surface approach to study. Speth et al. concluded that the research data identified several design and navigational issues that may have had a negative bearing on the students' recorded responses.

It appears that the online materials Speth et al. evaluated were designed in the late 1990s, with the navigational and design issues proving counter-productive in validating the learning approaches identified. This, one feels, affected the currency of their research. It did, however, show differences, and similarities, between the three learning approaches across the six key elements investigated. In all, an informative introduction to learning approaches as opposed to learning styles, as well as a descriptive explanation of Speth et al.'s research methodology.

 

Sun, S., Joy, M., & Griffiths, N. (2007). The Use of Learning Objects and Learning Styles in a Multi-Agent Education System. Journal of Interactive Learning Research, 18(3), 381-398.  Retrieved August 13, 2008, from ProQuest Education Journals database. (Document ID: 1317096231).

A qualitative and quantitative research paper addressing the design of an intelligent computer based learning system aimed at computer scientists by Shanghua Sun, Mike Joy, and Nathan Griffiths, University of Warwick, United Kingdom. [No further details provided.]

Sun et al.'s detailed technical design paper discussed the identification of a suitable learning style theory to be used as the basis for the design of an intelligent system that adapts to the changing needs of students. Their proposed intelligent system combined the use of learning objects, learning style theories and multi-agent technology. The learning style theory identified by Sun et al. was the Felder-Silverman Learning Style Model, which defines four dimensions of learning style preferences: sensing or intuitive; visual or verbal; active or reflective; and sequential or global. Sun et al. then described how they broke the Felder-Silverman questionnaire down to make it more manageable for their intelligent system.

The majority of Sun et al.'s paper was, in many respects, highly technical, although it did illustrate that an online learning system, if it is truly to cater for each individual's specific learning styles and needs, requires more than nice graphics, animations and other multimedia elements to address those divergent needs. The explanation by Sun et al. of their selection of their preferred learning style theory, as well as their further description of how they adapted that chosen model to suit their intended needs, was particularly insightful. One felt Sun et al. demonstrated the importance of creating manageable systems through the adaptation of pre-existing models, as opposed to "bolting-on" existing models in the hope they work as intended.

In conclusion: an informative investigation into the design of an intelligent learning system.

 

Trochim, W. M. K. (2006). The Research Methods Knowledge Base (2nd ed.). Retrieved August 23, 2008, from http://www.socialresearchmethods.net/kb/index.php

Aimed at undergraduate and graduate students, the Research Methods Knowledge Base (KB) is defined as a web-based textbook that addresses "all of the topics in a typical introductory … course in social research methods". The KB website content was indicated to have been written by Professor William M. K. Trochim, Department of Policy Analysis and Management, Cornell University, Ithaca, New York, USA. A complete bibliography outlining Trochim's publications was available on the KB parent website, which attested to Trochim's significant work in the areas of research, statistics and concept mapping.

The KB website covered the following headings concerning research fundamentals: Foundations, Sampling, Measurement, Design, Analysis, and Write-Up. The online version of Trochim's KB consisted of these six headings or 'chapters', whereas the printed version consisted of five chapters, although the same content sub-headings appeared to be covered in each version. The printed version of the book also listed James P. Donnelly, Ph.D. as a co-author, yet no mention was made on the KB website of his input into that content. Mention was made, however, of the existence of a more 'sophisticated' commercial version of the KB website published through Atomic Dog Publishing.

Each heading or chapter of the online KB contained additional sub-headings, with some drilling down further for additional detail. Unfortunately, the amount of information covered in the KB was so vast that it precluded detailed description in this bibliography entry. Yet Trochim's language in the online version of the KB generally appeared very accessible despite the content being quite technical in nature. The language was conversational, with the use of contractions taking the reader's focus outside the more usual academic approach. Conclusion: an interesting read with a logical flow of concepts.

 

Part B:

Mupinga, Nora and Yaw's article "THE LEARNING STYLES, EXPECTATIONS, AND NEEDS OF ONLINE STUDENTS" (2006) was selected for further evaluation because: [1] the title reflected this writer's topic for investigation, the development of a survey to identify learning styles and needs for online educational games as support materials; and [2] further exploration was needed to determine whether Mupinga et al. could have formulated their research in a manner that obtained the necessary data to address the stated purposes of their study. While Mupinga et al. did make sound observations through their qualitative research about individual learning styles and characteristics (p. 185), it was felt that they failed to adequately identify through their research how "the identified characteristics can be incorporated in designing effective online instruction" (p. 186). With the intent of undertaking research of one's own in the area of online instruction, it is desirable that the results and findings from that research be reliable and sound. Further evaluation of Mupinga et al.'s approach to their research may therefore provide valuable insight into how possible problems in one's own research methods might be addressed before they arise.

Mupinga et al. defined two methods that they utilised in obtaining the quantitative data for their study, which "sought to determine the learning styles, expectations and needs of online industrial design students. Further, the study explored how the identified characteristics can be incorporated in designing effective online instruction" (2006, p. 186). The two methods employed by Mupinga et al. were "an informal and free online Myers-Briggs Cognitive Style Inventory personality test" (p. 186) and "responses to an open-ended question: 'What are your needs and expectations as an Internet student'" (p. 186). The appropriateness, validity and overall effectiveness of these two methods will be discussed as three separate, though related, concerns:

The first concern relates to the selection by Mupinga et al. (2006, p. 186) of the Myers-Briggs Type Indicator as a method to determine student learning styles. The Myers-Briggs Type Indicator (MBTI) as developed by Myers and Briggs identifies personality types as “a structure of sixteen types based on four dichotomies” (Myers & Myers, ¶ 3) which are defined as: Extraversion or Introversion; Sensing or Intuition; Thinking or Feeling; and, Judging or Perception (Myers & Myers, ¶ 4). These sixteen combinations or types are expressed as INTJ, INFJ, ISTJ, ISFJ, INTP, ISTP, INFP, ISFP, ENTP, ENFP, ESTP, ESFP, ENTJ, ESTJ, ENFJ and ESFJ. It is those sixteen personality types that Mupinga et al. sought to identify in their sample and which they represented as a frequency distribution in TABLE 1 (2006, p. 186).

In the broadest sense, the MBTI classes people as either "Judging" or "Perceiving" (Center for Applications of Psychological Type, 2008, ¶ 2). In the most descriptive sense, the MBTI describes the sixteen types identified in the paragraph above. It could be argued that "Judgement" or "Perception" on their own do not provide enough meaningful information as to a student's "identified characteristics" (Mupinga et al., 2006, p. 186), whereas the sixteen personality types may be too fine a measurement to be manageable in a learning system. This last point could perhaps be validated by Sun, Joy and Griffiths (2007, p. 384), who indicated that one reason they selected the four-dimensional Felder-Silverman Learning Style Model was that "the number of dimensions of the model is constrained, improving the feasibility of its implementation".

While the MBTI could also be viewed as a four-dimensional model in much the same way as the Felder-Silverman model as employed by Sun et al. (2007, p. 384), "The indices E-I, S-N, T-F, and J-P are designed to point in one direction or the other. They are not designed as scales for measurement of traits or behaviors [sic]" (Center for Applications of Psychological Type, 2008, ¶ 25). It could therefore be argued that the Felder-Silverman model, in identifying the student's learning style preferences within its four dimensions of Sensing or Intuitive, Visual or Verbal, Active or Reflective, and Sequential or Global (Sun et al., 2007, p. 384), is not only immediately more descriptive from a labelling perspective but also more adaptable to an individual's learning styles in its application than the MBTI model. The MBTI model, on the other hand, as described by the Center for Applications of Psychological Type, is a complex set of interrelations between the four dichotomies (¶ 19):

The theory postulates specific dynamic relationships between the preferences. For each type, one process is the leading or dominant process and a second process serves as an auxiliary. Each type has its own pattern of dominant and auxiliary processes and the attitudes (E or I) in which these are habitually used. The characteristics of each type follow from the dynamic interplay of these processes and attitudes.

The question that one might now ask is "Why did Mupinga et al. select the Myers-Briggs model?" Mupinga et al. did not give their reasoning for their choice of the MBTI model except that it was "free" (2006, p. 186). If one were to view the website from which Mupinga et al. obtained the free MBTI online test (p. 186, p. 189), one might read that the MBTI, according to Reinhold on his webpage titled "MBTI & Myers-Briggs Personality Type Introduction – Personality Pathways" (2006, ¶ 1), is "the world's most widely used personality inventory". Are "free" and "most widely used" appropriate reasons to use what is described as a "personality instrument" (Center for Applications of Psychological Type, ¶ 5) to assess individual students' learning styles? Would one reasonably be expected to research the available learning style indicator instruments and determine their suitability to the task at hand when such a tool would provide a significant part of the empirical data obtained?

In response to the last question posed, further research would perhaps indicate that, "Given the controversial literature on learning styles" (Moallem, 2008, p. 218), the selection of an instrument to measure student learning styles would be explored and the findings provided to validate the basis of the research method(s) and the outcomes identified or proposed. Moallem did indeed do this (pp. 218-220), as did Cassidy (2006, pp. 171-172); Choi, Lee, and Jung (2008, pp. 7-9); Mestre (2006, pp. 27-29); Speth, Lee, and Hain (2006, pp. 34-35); and Sun et al. (2007, pp. 383-384). While the lack of identified research to justify their use of the MBTI model does not necessarily invalidate Mupinga et al.'s finding that their study "did not identify a particular learning style to be predominant with this group of online undergraduate students" (2006, p. 187), it did raise some concerns about their choice of methodology to assess student learning styles.

What did, however, raise a genuine concern about the validity of the method that Mupinga et al. chose for obtaining their MBTI data was the following (2006, p. 186):

The students completed an informal and free online Myers-Briggs Cognitive Style Inventory personality test to explore their personality type (see Reinhold 2004). For most accurate personality scores, using the official MBTI® inventory from professionals qualified to administer the test is recommended. [italics added]

This is what Reinhold (2006, ¶ 5-6) stated about the Cognitive Style Inventory provided on his website page titled "MBTI Personality Test: Understanding Your MBTI or Myers Briggs Personality Type", which Mupinga et al. appeared to have used in their survey:

This modest self-scoring inventory is Not a substitute for taking an MBTI ®. It is simply an introduction to personality type or psychological type. We hope it whets your appetite for learning more about the Myers and Briggs model of personality development and its message of increased human understanding.

The Style Inventory will allow you to approximate what are your MBTI Type preferences. After determining your 4 Type letters, you can jump to a number of links we have provided to help you get acquainted with the characteristics and indicators of the 16 types and verify if your type, as determined by this “unscientific” survey, seems to “fit” or not. [italics added]

In addition Salter, Evans and Forney (2006) stated in their paper “A Longitudinal Study of Learning Style Preferences on the Myers-Briggs Type Indicator and Learning Style Inventory” (p. 182):

The ethical use of psychological tests (American Counseling [sic] Association, 1995), like the two in this study, requires a practitioner to understand the psychometric properties of the scores they produce (E.6.a) and to communicate correct information about results (E.2.d). Regardless of the area of practice, many sources of measurement error can impact on an individual’s results, especially the testing environment. Our own experiences suggest that underinformed [sic] and unethical administration of these assessments produces inconsistent results …

Therefore, based upon the "MBTI" survey used, and quite possibly the manner in which it was conducted, it could perhaps be concluded that Mupinga et al.'s data was invalid.

The second concern one found with Mupinga et al.'s method of obtaining quantitative data related to the open-ended question "What are your needs and expectations as an Internet student" (2006, p. 186). As previously stated, this was a double-barrelled question and therefore should ideally have been broken down into two distinct questions: "What are your needs as an internet student?" and "What are your expectations as an internet student?" 'Needs' and 'expectations' might elicit diametrically opposed responses; for example, a student may feel they need one-on-one tuition, whereas their expectation of that transpiring as an internet student might be low. This last point indicates the need for a scale, such as a Likert response scale, to measure how the students rate each question's importance or relevance to their particular needs.

Despite stating that only one open-ended question was asked, Mupinga et al. declared: "Based upon frequencies of responses to the open-ended question, the top three expectations of online students were communication with the professor, instructor feedback, and challenging online courses" (2006, p. 186). Similarly: "Based upon frequencies of responses to the open-ended question, the top four needs of online students were technical help, flexible and understanding instructors, advance course information, and sample assignments" (p. 187). One might therefore ask: "How were those 'frequencies' of responses measured, and how were the variables of the responses categorised?" The method Mupinga et al. used to ascertain the results of those 'responses' to the open-ended question was left unexplained, hence the validity of the data was questionable. One might assume that key words were used to determine the frequencies of 'responses', which also suggests that a degree of bias was introduced into the overall data. Also, the lack of a method to measure the importance each student placed on their responses may indicate that the data collected "is of a meagre and unsatisfactory kind" (William Thomson, cited in Cary, 1999, ¶ 29).
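The keyword-tallying assumption above can be made concrete with a small sketch. Everything in it is hypothetical: the responses and the keyword-to-category mapping are invented for illustration and are not drawn from Mupinga et al.'s data. It simply shows how the analyst's choice of keywords directly determines which 'needs' surface as most frequent:

```python
from collections import Counter

# Hypothetical open-ended responses (invented; not Mupinga et al.'s data).
responses = [
    "I need technical help and quick instructor feedback",
    "Flexible instructors and communication with the professor",
    "Sample assignments and advance course information would help",
    "Mostly I expect feedback from the instructor",
]

# Hypothetical keyword-to-category mapping chosen by the analyst.
# A different mapping would yield different 'top needs': this choice
# is the unreported coding decision that introduces bias.
categories = {
    "feedback": "instructor feedback",
    "technical": "technical help",
    "flexible": "flexible instructors",
    "communication": "communication with professor",
}

# Tally one count per category per response containing a matching keyword.
tally = Counter()
for response in responses:
    text = response.lower()
    for keyword, category in categories.items():
        if keyword in text:
            tally[category] += 1

print(tally.most_common())
```

Note that the third response matches no keyword and is silently excluded from the tally even though it clearly expresses needs, which is precisely the kind of undocumented coding step the paragraph above questions.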

The third concern is that the data obtained from the MBTI survey was not correlated with the responses to the open-ended question posed by Mupinga et al. in an attempt to identify each student’s preferred learning style. Beyond the concerns already raised about the validity of the data obtained from the open-ended question, one now questions how “the identified characteristics can be incorporated in designing effective online instruction” (Mupinga et al., 2006, p. 186) when there was no identifiable attempt to determine whether the characteristics of a particular learning style might govern a student’s perceived expectations and needs. Yet Mupinga et al. indicated in the opening of their paper: “… the student’s learning characteristics are unknown, making it difficult to design effective instruction. Therefore, to maximise the student’s learning experiences, instructors need to be sensitive to diverse learning styles, needs, and expectations, and understand the online environment” (p. 185) [italics added]. Based upon their published findings, however, Mupinga et al.’s quantitative research still left those student learning characteristics unknown.

As Mupinga et al. recommended that “further studies be conducted with different groups of online students using the personality or learning styles inventories” (2006, p. 188), one sees benefit in replicating their research, provided the following three needs are addressed:

The first need is that an examination of the available learning style inventories be undertaken and documented to evaluate the suitability and accountability of each identified inventory tool. Suitability might be defined here as: the dichotomies or dimensions of each tool investigated can be mapped against training tasks and design outcomes such as those portrayed by Moallem (2008, pp. 222-223); and each tool can be adapted to remain manageable in a computer-based learning management system such as that discussed by Sun et al. (2007, pp. 387-389). Accountability might be defined here as identifying the conditions under which the inventory must be employed in order for it to be valid, such that the data resulting from those conditions can be recreated and validated. Mestre (2006) serves as a good basis for the identification and suitability of learning style inventories, while Salter et al. (2006) is representative of an investigation into a learning style inventory’s accountability.

The second need is that the survey undertaken to ascertain students’ needs and expectations of online course delivery be designed to gauge a range of anticipated responses, including the relative importance the surveyed students place on each response given. Such a survey might include a number of predetermined closed-ended questions, including some using a Likert response scale to rate the value of each response, plus open-ended questions to capture additional responses the survey may not have covered. Study into the construction of the survey itself should be undertaken, as there “are numerous small decisions that must be made – about content, wording, format, placement – that can have important consequences for your entire study” (Trochim, 2006, ¶ 1).
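As a minimal sketch of the kind of weighting a Likert-based survey would permit, the items and ratings below are entirely hypothetical: each closed-ended “needs” item is scored on a 5-point scale and items are then ranked by mean importance, rather than by bare frequency of mention.

```python
from statistics import mean

# Hypothetical 5-point Likert responses (1 = unimportant, 5 = essential)
# to closed-ended "needs" items; illustrative only.
ratings = {
    "one-on-one tuition":  [2, 3, 1, 2],
    "instructor feedback": [5, 4, 5, 4],
    "advance course info": [4, 3, 4, 5],
}

# Rank items by mean importance so each response carries relative
# weight, instead of every mention counting equally.
ranked = sorted(ratings.items(), key=lambda kv: mean(kv[1]), reverse=True)
for item, scores in ranked:
    print(f"{item}: mean importance {mean(scores):.2f}")
```

Even this toy example separates an item students mention often but value little from one they rate as essential, which a frequency count alone cannot do.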

The third need, which Mupinga et al. (2006) neither addressed nor identified in their paper, is a method to determine the correlation, if any, between the data gathered from the chosen student learning style inventory and the responses elicited from the student needs and expectations questionnaire. As already discussed, without such a correlation it is questionable how student learning characteristics could be ascertained so as “to maximise the student’s learning experiences” (Mupinga et al., p. 185). Hence, it may well be reasoned that this proposed third need overarches the underpinning methodologies employed in replicating such a research topic.
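One way such a correlation could be tested, sketched here with entirely hypothetical counts (the MBTI dimension, the response categories, and the cell values are all invented for illustration), is a chi-square test of independence on a contingency table of learning-style category against top-ranked need, computed here by hand with the standard library.

```python
# Hypothetical contingency table: (MBTI dimension, top-ranked need)
# mapped to a count of students; not data from any real study.
table = {
    ("Introvert", "instructor feedback"): 12,
    ("Introvert", "technical help"):       8,
    ("Extravert", "instructor feedback"):  5,
    ("Extravert", "technical help"):      15,
}

rows = sorted({k[0] for k in table})
cols = sorted({k[1] for k in table})
total = sum(table.values())
row_sums = {r: sum(table[(r, c)] for c in cols) for r in rows}
col_sums = {c: sum(table[(r, c)] for r in rows) for c in cols}

# chi2 = sum over cells of (observed - expected)^2 / expected,
# where expected = row total * column total / grand total.
chi2 = sum(
    (table[(r, c)] - row_sums[r] * col_sums[c] / total) ** 2
    / (row_sums[r] * col_sums[c] / total)
    for r in rows for c in cols
)
print(f"chi-square = {chi2:.2f}")
```

The resulting statistic would then be compared against a chi-square distribution with (rows − 1)(columns − 1) degrees of freedom; the point of the sketch is only that some such explicit test is what the third need calls for.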

In conclusion: the review of Mupinga et al.’s research paper (2006) has proved a particularly valuable exploration of what might constitute the basis for the design and development of a qualitative and quantitative research paper for this writer’s future research concerning “learning styles and needs for online educational games as support materials”. One is now more mindful of the importance of establishing the basis of the research methodology (or methodologies) employed, along with the validation and verification of any proposed outcomes. One now holds the view that it is not appropriate to collect quantitative data and then propose theoretical research outcomes through inductive means, as Mupinga et al. (pp. 187-188) would appear to have done. In doing so, one feels, the outcomes suggested by the research would be merely veiled fabrications of the initial dissertation, which ultimately serves no-one except those who proposed the research paper to begin with. In other words, research simply for research’s sake.

Apart from the identification and validation of the chosen survey methodologies, one may need to examine how the data collected through those methodologies could be correlated and evaluated. As the student data collected from the selected learning style inventory and a “needs and expectations” survey in one’s own research could be assumed to be of a multidimensional nature, further exploration and study into that area of statistical analysis would need to be undertaken. Initial investigation into the multidimensional “branch of mathematical statistics” (Aivazyan, 2001, ¶ 1) indicated that one would need either to undertake additional units in one’s own Masters course of study or, preferably, to team up with someone literate in that area of mathematical statistics to correlate and evaluate the anticipated data. As that conclusion had not been anticipated, one feels that this research topic has proved an invaluable exercise.

 

References:

Aivazyan, S. A. (2001). Multi-dimensional statistical analysis, multivariate statistical analysis. SpringerLink. Retrieved August 22, 2008, from http://eom.springer.de/m/m065140.htm

CAPT. (2008). About the MBTI instrument: Jung’s Theory of Psychological Types and the MBTI® Instrument. Center for Applications of Psychological Type. Retrieved August 20, 2008, from http://www.capt.org/take-mbti-assessment/mbti-overview.htm

Cary, D. (1999). Quotes – Science. Retrieved August 20, 2008, from http://rdrop.com/~cary/html/science_quotes.html

Cassidy, S. (2006). Learning style and student self-assessment skill. Education & Training, 48(2/3), 170-177.  Retrieved August 13, 2008, from ABI/INFORM Global database. (Document ID: 1039959161).

Choi, I., Lee, S. J., & Jung, J. W. (2008). Designing Multimedia Case-Based Instruction Accommodating Students’ Diverse Learning Styles. Journal of Educational Multimedia and Hypermedia, 17(1), 5-25.  Retrieved August 13, 2008, from Academic Research Library database. (Document ID: 1425738751).

Mestre, L. (2006). Accommodating Diverse Learning Styles in an Online Environment. Reference & User Services Quarterly, 46(2), 27-32.  Retrieved August 13, 2008, from Academic Research Library database. (Document ID: 1192839391).

Moallem, M. (2008). Accommodating Individual Differences in the Design of Online Learning Environments: A Comparative Study. Journal of Research on Technology in Education, 40(2), 217-245.  Retrieved August 13, 2008, from Academic Research Library database. (Document ID: 1447280671).

Mupinga, D. M., Nora, R. T., & Yaw, D. C. (2006). The Learning Styles, Expectations, and Needs of Online Students. College Teaching, 54(1), 185-189. Retrieved August 13, 2008, from ProQuest Education Journals database. (Document ID: 1004646511).

Myers, K. D., & Myers, P. B. (2006). How Did Jung’s Eight Types Become Myers’ Sixteen Types? MBTI Type Today. Retrieved August 18, 2008, from http://www.mbtitoday.org/16types.html

Reinhold, R. (2006). MBTI & Myers-Briggs Personality Type Introduction – Personality Pathways. Personality Pathways. Retrieved August 17, 2008, from http://www.personalitypathways.com/MBTI_intro.html

Reinhold, R. (2006). MBTI Personality Test: Understanding Your MBTI or Myers Briggs Personality Type. Personality Pathways. Retrieved August 20, 2008, from http://www.personalitypathways.com/type_inventory.html

Salter, D. W., Evans, N. J., & Forney, D. S. (2006). A Longitudinal Study of Learning Style Preferences on the Myers-Briggs Type Indicator and Learning Style Inventory. Journal of College Student Development, 47(2), 173-184.  Retrieved August 13, 2008, from ProQuest Education Journals database. (Document ID: 1012400091).

Speth, C. A., Lee, D. J., & Hain P. M. (2006). Prioritizing Improvements in Internet Instruction Based on Learning Styles and Strategies. Journal of Natural Resources and Life Sciences Education, 35, 34-41.  Retrieved August 13, 2008, from ProQuest Education Journals database. (Document ID: 1222181221).

Sun, S., Joy, M., & Griffiths, N. (2007). The Use of Learning Objects and Learning Styles in a Multi-Agent Education System. Journal of Interactive Learning Research, 18(3), 381-398.  Retrieved August 13, 2008, from ProQuest Education Journals database. (Document ID: 1317096231).

Trochim, W. M. K. (2006). Survey Research. The Research Methods Knowledge Base (2nd ed.). Retrieved August 21, 2008, from http://www.socialresearchmethods.net/kb/survwrit.php
