QEP Proposal


EXECUTIVE SUMMARY

TEAM USA focuses on improving student learning outcomes in STEM courses by increasing critical thinking and collaboration using Team-Based Learning instructional strategies.  As such, the goal of the project is for students to achieve higher mastery levels of course content through the application of collaborative learning and critical thinking to real-life pedagogical scenarios.  TEAM USA is aligned with the University's mission to "apply knowledge in service to the people of Alabama as citizens of a global community" and with goal #1 of the University's strategic plan, "To build upon the academic quality and learning environment of the University."

There was broad-based involvement of constituents (students, faculty, administrators, community members, and alumni) throughout the planning and implementation stages of the project.  These constituencies will also be involved in the management and oversight of the project.  Constituents provided feedback at several public hearings and through various data-collection mechanisms.  Representatives were also invited to serve on the Leadership Team, the Concept Development and Selection Committees, the Implementation Team, and the Advisory Council.  During this process, it was determined that student needs were most critical in Science, Technology, Engineering, and Mathematics (STEM) courses, and therefore the decision was made to focus the QEP on improving learning in those courses.

Using Bloom's (1956) taxonomy of higher-order thinking, the content acquisition and application targeted in the learning outcomes will be constructed around each student's ability to evaluate information, synthesize alternate versions of problems or issues, and base conclusions or solutions on applied information.  Mastery of student learning outcomes will be facilitated by the use of Team-Based Learning, a collaborative instructional strategy shown to improve critical thinking skills.  Team-Based Learning has also been effective across multiple disciplines, including STEM (Clark, Nguyen, Bray, & Levine, 2008; Haberyan, 2007; Thompson, Schneider, Haidet, Levine, McMahon, Perkowski, & Richards, 2007).

Multiple formative and summative assessments such as the ETS Proficiency Profile, the National Survey of Student Engagement, course-embedded assessments, student and faculty satisfaction surveys, and peer review will be utilized during the project to gauge improvements in mastery levels of student learning outcomes and to guide project development.

Institutional accountability for the delivery of TEAM USA rests with the Senior Vice President for Academic Affairs.  The QEP Director is responsible for the day-to-day management and delivery of the project, including participant recruitment and selection, professional development, fund management, and oversight of the TEAM USA classroom.  The QEP Director is also responsible for compiling and presenting annual project reports.

 

INSTITUTIONAL PROFILE

The University of South Alabama's mission is to offer high-quality teaching, research, public service, and healthcare programs that create, communicate, preserve, and apply knowledge in service to the people of Alabama as citizens in a global community. The vision of the University is to become a preeminent comprehensive university that is recognized for its intellectual, cultural, and economic impact on the health and well-being of those we serve as leaders and citizens in a global community (2012-2013 Undergraduate/Graduate Catalogue, p. 4).

The University of South Alabama is an urban, public, regional institution with a commitment to the development of human capital through exemplary practices in teaching, research, and service to the community.  The University was created by act of the Alabama State Legislature in May 1963 and is the only public doctoral-level institution of higher learning on the upper Gulf Coast. The University is strategically located in the greater Mobile area, which has a population of more than a million within a 100-mile radius. From an initial enrollment of fewer than 1,000 students, the University has expanded to an enrollment of approximately 15,000 students and is one of the fastest-growing universities in Alabama.  Over 70,000 degrees have been awarded at USA in its fifty-year history.  One-third of area physicians and over 85% of all teachers in the Mobile Public School System received their training at USA.  Additionally, over 8,800 nurses, 11,000 business and accounting professionals, 6,500 engineers, and 4,900 allied health professionals completed a degree at USA. The University of South Alabama is also a major healthcare provider for the region. USA Health University Hospital, USA Children's and Women's Hospital, USA clinicians, and the Mitchell Cancer Institute offer high-quality treatment to over 250,000 patients per year.

The University of South Alabama has been able to serve many students, including those who otherwise might not have had the opportunity to pursue an undergraduate, graduate, or professional degree.  Seventy-four percent of students at USA are full-time, and a large number are Pell Grant recipients, first-generation college students, and underserved minorities.  Over 90% of students at USA receive some type of financial assistance, with 75% of these students receiving grants and 54% receiving loans.

The University of South Alabama has become a major research institution and has several research centers and institutes: the Center for Archaeological Studies, the Center for Business and Economic Research, the Center for Forensics and Information Technology Security, the Center for Integrative Studies in Science, Technology, Engineering, and Mathematics, the Center for Lung Biology, the Coastal Weather Research Center, the Center for Real Estate Studies, and the USA Mitchell Cancer Institute.  Externally funded contracts and grants amounted to $52.8 million in 2011, and revenue from licensing of inventions and innovations was $2.4 million.  The University also offers a wide range of degrees in ten colleges and schools.  Degrees are awarded at the undergraduate, graduate, and doctoral levels.

The University has an annual payroll of $404 million, employs over 5,500 people, and is the second-largest employer in Mobile, Alabama. It has remained one of Alabama's fastest-growing universities for the past several years.  The University of South Alabama has an annual economic impact of two billion dollars and generates more than seven dollars for every dollar provided by the state of Alabama.

 

PROCESS USED TO DEVELOP THE QUALITY ENHANCEMENT PLAN

Foundational Planning

The process for developing the Quality Enhancement Plan was consistent with the planning principles and practices developed by Todd Litman (2011).  The principles and practices followed were:

  • Comprehensive – all significant options and impacts are considered
  • Efficient – the process will not waste time or money
  • Inclusive – stakeholders affected by the plan have opportunities to be involved
  • Informative – results are communicated to, and understood by, stakeholders
  • Integrated – individual, short-term decisions support strategic, long-term goals
  • Logical – each step leads to the next
  • Transparent – all participants understand the process

The 老司机福利网 Quality Enhancement Plan was developed through a deliberate four-phase process involving all university stakeholder groups.  This process was initiated during the 2011-2012 academic year. The first phase involved the formation of the SACS accreditation Leadership Team during the summer of 2011; the second phase involved the creation of the QEP Development Committee during the fall of 2011; the third phase involved the creation of the QEP Implementation Team during the spring of 2012; and the fourth phase involved the creation of the QEP Advisory Council during the summer of 2012.  Multiple constituencies including students, faculty, administrators, and support personnel were represented in the process.

The Leadership Team (Table 1) was responsible for supervising the reaffirmation process; it managed and validated the compliance certification and the development of the QEP.  The Leadership Team was modified in August 2012: the late Dr. Joan Exline (formerly Associate Vice President of Institutional Research, Planning, Assessment and Regional Campuses and SACS liaison) was replaced by Dr. Charles Guest (Interim Associate Vice President of Institutional Research, Planning and Assessment); Dr. Philip Carr (2012-13 Faculty Senate Chair) replaced Dr. Jim Connors (2011-12 Faculty Senate Chair); and Dr. Wanda Maulding (SACS Liaison) was added.

Table 1
QEP Leadership Team

Member – Title
Gordon Moulton – President
John Smith – Vice President for Student Affairs and Special Assistant to the President
G. David Johnson – Senior Vice President for Academic Affairs
Ron Franks – Vice President for Health Sciences
Wayne Davis – Vice President for Financial Affairs
Charles Guest (replaced Joan Exline*) – Interim Associate Vice President for Institutional Research, Planning and Assessment
Wanda Maulding (added) – SACS Accreditation Liaison
Phil Carr (replaced Jim Connors) – Faculty Senate Representative

*Dr. Exline became ill in 2011 and passed away in August 2012.

The QEP Development Committee was appointed by the Leadership Team.  Members of the QEP Development Committee are listed in Table 2.

Table 2
QEP Concept Development Committee

Member – Title
Agnew, Andrea – Assistant to the Dean, Student Services
Carr, Nicole – Director, Student Academic Success
Coleman, Robert – Associate Dean, Arts and Sciences
Delmas, Peggy – Associate Professor, Leadership and Teacher Education
Dempsey, Jack – Director, Center for Innovation in Learning
Doughtery, Carroll – Assistant Professor, Mechanical Engineering
Devore, Donald – Associate Professor, History
Exline, Joan – Associate Vice President
Farmer, Joseph – Assistant Professor, Nursing
Gessner, Lauren – Student
Hussain, Zain – Student
Lagan, David – Associate Professor, Computer and Information Science
Martin, Cecelia – Director of Assessment
Millner, Vaughn – Dean, Continuing Education
Spake, Deborah (Chair) – Associate Dean, Business
Talbott, Richard – Dean, College of Engineering
Thomas, Crystal – Assistant to the Dean, Arts and Sciences
West, Kevin – Assistant Professor, Chemical and Biomolecular Engineering

The process of developing the QEP began on January 24, 2011, with a meeting of the QEP Development Committee.  The Committee was chaired by Dr. Deborah Spake, Associate Dean of the Mitchell College of Business, and included 17 other members drawn from faculty, students, and administrators.

In the first meeting, Dr. David Johnson, Senior Vice President for Academic Affairs, spoke about the importance of selecting a QEP topic. Dr. Johnson also reviewed the purpose of the committee, which was to coordinate a participative process to identify opportunities for a project that would improve student learning. Dr. Johnson delegated the responsibility of coordinating this participative process to the QEP Development Committee and emphasized the importance of the QEP process to the University. During this same meeting, an overview of the SACS reaffirmation process was presented by Dr. Joan Exline, Associate Vice President for Institutional Research, Planning, Assessment and Regional Campuses. As part of this overview, Dr. Exline discussed the QEP development process and the key steps for developing the QEP for the University.

The QEP Development Committee was tasked with soliciting and evaluating ideas for a topic that could improve student learning and be developed into a full proposal. The topic had to show promise for achieving meaningful improvements in student learning, build on previous assessment findings at USA, be manageable in scope, be amenable to assessment, be affordable, and be consistent with the University mission. Committee members were encouraged to become familiar with the reviewer guidelines and the QEP timeline. Dr. Exline explained that topics recommended by the QEP Development Committee would be evaluated by the Leadership Team and then considered by President Moulton for approval. Once the topic was selected, the QEP Implementation Team would be formed to pilot and implement the project.

The QEP Development Committee also reviewed a PowerPoint presentation of data pertaining to university strengths and weaknesses. This presentation generated much discussion.  The data summaries reviewed may be found in Table 3.

Table 3
Data Summaries

Graduation rates*: 2010 – 37%; 2011 – 39%
Retention rates**: 2010 – 65%; 2011 – 66%
ETS proficiencies (Critical Thinking): 2010 – Proficient 10%, Marginal 21%, Not Proficient 70%; 2011 – N/A
NSSE results for seniors***, Level of Academic Challenge: 2010 – N/A; 2011 – 56.5
NSSE results for seniors***, Active and Collaborative Learning: 2010 – N/A; 2011 – 48.4
Graduating seniors' perceptions of academic preparation: 2010 – Excellent 35%, Good 52%, Fair 11%, Poor 2%; 2011 – Excellent 31%, Good 52%, Fair 15%, Poor 2%
Top 25 most difficult courses (as measured by course success rates): 2010 – 51.5%-64% success rate; 2011 – 51.1%-67.4% success rate

*2010 rate based on the 2004 cohort six-year rate; 2011 rate based on the 2005 cohort six-year rate.
**USA Office of Institutional Research, Planning and Assessment.
***Scores were converted to a 100-point scale, with higher scores indicating higher levels of perceived competence.

The committee also discussed:

  • The Success Center Early Alert program (an academic program intended to help students be successful in 100 and 200 level courses)
  • The Supplemental Instruction program (a learning enhancement program)
  • The JagPals program (a peer mentoring program)
  • The First Year Experience course
  • Living Learning Communities (groups of students with shared interests)
  • The needs of transfer students

Based on this review of data, the Committee concluded that two areas of student performance merited special attention: critical thinking skills and the ability to work collaboratively with others.  Moreover, student retention and graduation rates were identified as lower than those of peer institutions.  At the time, the University had already implemented several initiatives to improve academic success, retention rates, and graduation rates.  These initiatives included expanding the first-year experience courses, implementing learning communities, and integrating academics into residence-hall settings.  Other initiatives included increasing scholarship opportunities and redesigning courses identified as particularly challenging for students (e.g., developmental mathematics courses).

Framing the Plan

During a meeting on January 24, 2011, the QEP Development Committee agreed that its first step would be to broadly solicit ideas from stakeholders for the QEP topic.  As a result, a draft of a topic survey document was discussed by the Committee. Also discussed during the meeting was a rubric to evaluate the QEP topic ideas collected from the University community. The Committee felt that it was important to measure perceptions of the appropriateness of the topic for further evaluation and to emphasize innovation in the criteria.

After the development of the survey, discussion ensued as to how to move from "idea generation" to "proposal development" in the most efficient manner. Committee members emphasized that collaboration should be encouraged and that faculty and staff working on those ideas selected for further development would have support from the Office of Institutional Research, Planning, and Assessment and from the Director of Student Academic Success. The group also discussed the importance of developing a concept paper before moving to the full proposal stage. Based on Committee input, Dr. Exline offered to incorporate these suggestions into a written document describing the process for selecting the QEP topic and to post it on the website for the Committee to review and revise.

The QEP Development Committee met again on February 10, 2011 to discuss the selection of criteria, procedures, and timelines for the final identification of the QEP topic.  At this meeting, Committee members agreed that links to the sample QEPs from other institutions found on the SACS website should be made available to the members of the Committee, that multiple submissions of topics would not be appropriate, that topics must be connected to the University mission, that topics must fit the student population, and, most importantly, topics must focus on student learning.  A seven-step QEP topic selection process was approved:  1) Communicate the purpose of the QEP; 2) Categorize ideas and evaluate them using a standard rubric format; 3) Solicit concept papers; 4) Evaluate concept papers and select three to five for development into brief proposals; 5) Prepare full length proposals with support from the Office of Institutional Research, Planning and Assessment (IRPA), and the Director of Student Academic Success; 6) Evaluate the full proposals; and 7) Select up to three proposals to forward to the Leadership Team for consideration.

The Committee also discussed the importance of reaching multiple constituencies when communicating the purpose of the QEP and soliciting ideas for the QEP topic. Four methods of reaching these constituencies were suggested including: (a) an e-mail survey distributed to the entire USA community, (b) a Facebook page, (c) a rotating panel on the USA website, and (d) a series of forums with different University groups.

As a follow-up to this meeting, an email was sent to the entire USA community informing them about the QEP and "soliciting ideas for broad topics that have the potential of improving the skills of USA students." Examples included, but were not limited to: research skills, writing, reading, math, presentations, critical thinking, and working with others.  The email directed readers to a survey so University faculty, staff, and students could share their ideas about potential ways to improve student learning at USA.

Open forums regarding the QEP were conducted on February 28, 2011 with the Student Government Association and on March 8, 2011 with faculty and staff at the USA main campus and the Baldwin County Campus.

Members of the QEP Development Committee read through the extensive comments collected via the survey and the summaries from the open forums.  As a result of these efforts, a list of areas for possible development was generated.  These areas included:

  • Collaborative learning
  • Critical thinking
  • Global citizenship
  • Integrating information
  • Leadership skills
  • Math vocabulary and making a connection
  • Metacognitive skills
  • Organization, time management and study habits
  • Practical application of information
  • Reading with comprehension
  • Undergraduate research
  • Using technology
  • Written communication and writing skills

Concept Paper Evaluation

The QEP Development Committee evaluated 14 potential topics at its April 7, 2011 meeting to determine which should progress to the concept-paper development phase. Members discussed the ideas with a focus on clarity, sense, scope, innovation, excitement, potential barriers, manageability, and appropriateness as a full proposal.  Prior to this meeting, committee members individually reviewed and evaluated the topic ideas using a scoring rubric based on a 1 (low) to 5 (high) Likert-type scale. Rubric items addressed the appropriateness of the topic, the relationship of the topic to student learning, the appeal of the topic for students, the uniqueness of the topic, and the potential to generate excitement and commitment from faculty and staff.  The items also included an open-ended question aimed at capturing comments or concerns. Initial ideas and mean rating scores are presented in Table 4.

Table 4
QEP Topic Ratings

Selected Topic – Rating*
Collaborative Learning – 4.20
Critical Thinking – 4.53
Global Citizenship – 3.70
Integrating Information – 3.72
Leadership Skills – 3.90
Math Vocabulary and Making a Connection – 3.67
Metacognitive Skills – 3.44
Organization / Time Management / Study Skills – 4.00
Practical Application of Information – 3.90
Reading with Comprehension – 3.96
Teaching Effectiveness – 3.40
Undergraduate Research – 4.10
Using Technology – 4.07
Writing Skills – 4.00

*The higher the score, the higher the rating.
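
Each rating in Table 4 is simply the mean of the individual committee members' 1 (low) to 5 (high) rubric scores for that topic. As an illustration only, the following minimal Python sketch uses hypothetical member scores (chosen here to reproduce the reported means for two topics; they are not the committee's actual data):

```python
# Minimal sketch: each Table 4 rating is the mean of individual committee
# members' rubric scores on a 1 (low) to 5 (high) scale.
# The scores below are hypothetical, chosen to reproduce two reported means.

scores = {
    "Critical Thinking":      [5, 4, 5, 4, 5, 4, 5, 4, 5, 4, 5, 4, 5, 5, 4],
    "Collaborative Learning": [4, 4, 5, 4, 4, 4, 5, 4, 4, 4, 5, 4, 4, 4, 4],
}

for topic, ratings in scores.items():
    mean = sum(ratings) / len(ratings)
    print(f"{topic}: {mean:.2f}")   # -> 4.53 and 4.20
```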

After reviewing the results of the topic evaluations, the committee selected the following themes for advancement to the next phase of the selection process, the call for concept papers:

  • Collaborative learning
  • Critical thinking
  • Undergraduate research
  • Using technology
  • Writing skills

Instructions for a concept paper were developed and emailed to the university community during spring 2011. The concept-paper cover sheet included open-ended as well as multiple-choice questions.  These questions addressed (a) the primary role of the submitting professor, (b) the rationale for the idea, including its connection to improving student learning and success, (c) the level of innovation required for implementation of the topic, and (d) the intended role of the submitting professor if his/her topic were selected.

Eighteen concept papers were submitted to the QEP Development Committee for review. A rubric to evaluate the concept papers included the following prompts:

  • The concept paper clearly relates to student learning.
  • The concept paper makes sense, given the mission of USA.
  • The concept paper describes a project that will appeal to a substantial segment of the student population.
  • The concept paper describes an innovative and new endeavor or a significant extension of ongoing efforts to improve student learning.
  • The proposed topic has the potential to generate excitement and commitment with faculty and staff across campus.
  • Any potential obstacles are manageable.
  • The concept paper should be considered for development as a full proposal.

Prior to the next meeting held on June 15, 2011, the papers were uploaded to a website for review by committee members. Members scored the concept papers, submitted their scores, and narrowed down the list to four papers for further development.

 

IDENTIFICATION OF THE TOPIC

Concept Development

The QEP Development Committee carefully selected the topics shown in Table 5, from among the potential topics rated in Table 4, to advance to the proposal-writing phase.  The Committee agreed that topics could be combined with related concept papers with the permission of the main topic's concept-paper author(s).

Table 5
Concept Papers Selected for Further Development

Lead Author – Selected Topic(s) – Related Topics
Nicole Carr – Discipline-specific critical thinking exercises – Transforming student thinking through General Education Requirements (Dr. Carr)
Cris Hollingsworth – Undergraduate critical thinking and writing – Critical thinking and writing (Dr. Poston); Transforming student thinking; Learning to write to learn (Dr. McMillan)
Brenda Litchfield – Academic Contracts – Writing Coaches (Dr. Everest); Self-regulated learning (Dr. McMillan)
Jack Dempsey – Collaborative learning and teaching – Using Blogs (Dr. Strange); Using Projects (Dr. Strange)

The QEP Development Committee met on September 12, 2011 to discuss the proposals that had been submitted for consideration (one of the authors, Cris Hollingsworth, chose not to participate).   The three proposals submitted were: (a) Critical Thinking: Making a Difference in Student Learning and Lives; (b) Collaborative Teaching and Learning at USA; and (c) Academic Learning Contracts: Active Learning and Meaningful Engagement.  The QEP Development Committee sent all three proposals to the Leadership Team with evaluative comments and suggestions.

 

Proposal Summaries and Evaluative Comments

 

Critical Thinking: Making a Difference in Student Learning and Lives. This proposal consisted of a critical thinking assignment in the First Year Experience course and in eight upper-level courses. The proposal suggested improving students' critical thinking skills by training faculty to develop and implement critical thinking assignments in both lower- and upper-division courses.  Direct assessment of student learning outcomes was to be based on the AAC&U VALUE critical thinking rubric and the California Critical Thinking tests, with indirect assessment relying on student surveys and assessment of critical thinking disposition.  Faculty would require professional development prior to the initiation of the project. The five-year estimated budget was $532,100.

Committee comments regarding this proposal were to:

  • Add a critical thinking assignment for transfer students, probably as part of a transfer experience course
  • Add a graduate student critical thinking component
  • Consider using graduate students as mentors to help with undergraduate critical thinking assignments, if appropriate, at selected colleges
  • Develop distinct critical thinking assignments for online courses/programs
  • Design assessment of the faculty training component
  • Ensure faculty are willing to submit assessment results following training
  • Review recent research published in Academically Adrift: Limited Learning on College Campuses

Collaborative Teaching and Learning at USA.  This proposal included four primary components.  The first component consisted of the creation of an infrastructure that allowed for informal collaborative sharing of course materials, such as application activities on Sakai CLE project sites, and the creation of a University institutional repository.  Collegial teams would be developed to foster course-content innovation. Active engagement of students would be an instructional focus of the project. Furthermore, an evidence-based collaborative learning model, Team-Based Learning, would be implemented throughout the curriculum.  Direct assessment of learning outcomes would be tied to models of collaborative learning with a focus on pre- and post-measures of student success.  The estimated budget for the project was $300,000 plus faculty incentives and the salary costs of a QEP Director.

Committee comments regarding this proposal were to:

  • Provide additional details regarding pilot development
  • Further develop assessment processes related to learning outcomes
  • Include incentives in the overall budget
  • Include critical thinking among the teaching objectives
  • Review recent research published in Academically Adrift: Limited Learning on College Campuses


The committee felt this project could also help improve the quality and consistency of instruction by adjunct faculty.

 

Academic Learning Contracts: Active Learning and Meaningful Engagement.  This proposal involved using academic learning contracts to help students increase their knowledge, skills, and abilities in a subject area by choosing from a variety of assignments.  Faculty members were to be trained through workshops or departmental sessions, including online instructional modules.  Direct assessment of learning outcomes would be accomplished by comparing contract and non-contract course outcomes, supplemented by indirect assessment of student and faculty experiences.  The estimated budget was $200,000 plus faculty incentives and the cost of a QEP Coordinator.

Committee comments regarding this proposal were to:

  • Develop further details regarding the pilot and implementation schedule
  • Ensure the project would not conflict with accreditation agency expectations
  • Consider incorporation of this project within the collaborative learning project
  • Review recent research published in Academically Adrift: Limited Learning on College Campuses

            
During the proposal-evaluation phase of topic development, the QEP Development Committee reviewed student data collected regarding critical thinking (Level of Academic Challenge) and collaboration (Active and Collaborative Learning) via the National Survey of Student Engagement (NSSE). In Table 6, summary scores for critical thinking and collaboration were converted to a 100-point scale, with higher scores indicating higher levels of perceived competence.  USA's scores were judged by the committee to be unsatisfactory. Critical thinking scores ranged from 47.9 in 2006 to 52.1 in 2011, below the scores for peer urban institutions (UI).

 

Table 6
Level of Academic Challenge (LAC) and Active and Collaborative Learning (ACL): Mean Comparison of USA First-Year Students and Urban Institution (UI) First-Year Students

Level of Academic Challenge: 2006 – USA 47.9, UI 49.9***; 2009 – USA 51.1, UI 53.0*; 2011 – USA 52.1, UI 53.0
Active and Collaborative Learning: 2006 – USA 36.5, UI 39.8***; 2009 – USA 43.2, UI 43.3; 2011 – USA 39.8, UI 42.0*

*p<.05; **p<.01; ***p<.001
Note: Means are weighted by gender and enrollment status (and size for comparisons).

In Table 7, NSSE mean scores for items measuring collaboration and critical thinking are presented on 1-4 scales, with the response anchors for each item given in the table footnotes; as the value increases, so does the level of perceived competence.  Once again, USA's scores were lower than desired.  In 2006, all but one USA mean fell below 3; in 2009, all but two; and, in 2011, all but four.  No mean reached the top of the scale (4). Moreover, 60% of USA scores for 2006, 2009, and 2011 were lower than those of peer urban institutions (UI).

Table 7
Student Perceptions of Their Experiences at USA: USA First-Year Students Compared with Urban Institution (UI) First-Year Students

COLLABORATION
Worked with classmates outside of class to prepare class assignments¹: 2006 – USA 1.87, UI 2.14***; 2009 – USA 2.35, UI 2.34; 2011 – USA 2.22, UI 2.32*
Tutored or taught other students (paid or voluntary)¹: 2006 – USA 1.68, UI 1.56**; 2009 – USA 1.64, UI 1.65; 2011 – USA 1.65, UI 1.60
Working effectively with others³: 2006 – USA 2.75, UI 2.82; 2009 – USA 2.91, UI 2.97; 2011 – USA 3.01, UI 2.95

CRITICAL THINKING
Analyzing the basic elements of an idea, experience, or theory, such as examining a particular case or situation in depth and considering its components²: 2006 – USA 2.96, UI 2.98; 2009 – USA 3.03, UI 3.13; 2011 – USA 3.13, UI 3.17
Synthesizing and organizing ideas, information, or experiences into new, more complex interpretations and relationships²: 2006 – USA 2.72, UI 2.77; 2009 – USA 2.82, UI 2.91; 2011 – USA 2.89, UI 2.92
Making judgments about the value of information, arguments, or methods, such as examining how others gathered and interpreted data and assessing the soundness of their conclusions²: 2006 – USA 2.81, UI 2.79; 2009 – USA 2.92, UI 2.96; 2011 – USA 2.98, UI 2.96
Applying theories or concepts to practical problems or in new situations²: 2006 – USA 2.95, UI 2.90; 2009 – USA 2.95, UI 3.08*; 2011 – USA 3.03, UI 3.07
Thinking critically and analytically³: 2006 – USA 3.12, UI 3.07; 2009 – USA 3.24, UI 3.17; 2011 – USA 3.27, UI 3.22
Solving complex real-world problems³: 2006 – USA 2.47, UI 2.47; 2009 – USA 2.63, UI 2.68; 2011 – USA 2.65, UI 2.65

¹During the current school year, about how often have you done each of the following? 1 = never; 2 = sometimes; 3 = often; 4 = very often
²During the current school year, how much has your coursework emphasized the following mental activities? 1 = very little; 2 = some; 3 = quite a bit; 4 = very much
³To what extent does your institution emphasize each of the following? 1 = very little; 2 = some; 3 = quite a bit; 4 = very much
*p<.05; **p<.01; ***p<.001
Note: Means are weighted by gender and enrollment status (and size for comparisons).
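
Two scoring conventions appear in Tables 6 and 7: benchmark-style scores on a 0-100 scale (Table 6) and raw item means on 1-4 scales (Table 7), in both cases weighted by gender and enrollment status. The minimal Python sketch below illustrates both computations; the responses and weights are hypothetical, and the linear 1-4 to 0-100 rescaling is an assumption used here for illustration, not NSSE's published procedure.

```python
# Minimal sketch of the two score conventions referenced in Tables 6 and 7,
# using hypothetical data (not actual NSSE records).

def rescale_to_100(response, low=1, high=4):
    """Map a Likert response on [low, high] linearly onto a 0-100 scale."""
    return (response - low) / (high - low) * 100

def weighted_mean(values, weights):
    """Mean weighted, e.g., by gender and enrollment status."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Hypothetical item responses (1 = very little ... 4 = very much)
responses = [2, 3, 3, 4, 2]
# Hypothetical post-stratification weights for the same five students
weights = [1.2, 0.8, 1.0, 1.1, 0.9]

item_mean = weighted_mean(responses, weights)   # 1-4 scale, as in Table 7
benchmark = weighted_mean([rescale_to_100(r) for r in responses], weights)

print(f"Item mean (1-4 scale): {item_mean:.2f}")          # 2.80
print(f"Benchmark-style score (0-100): {benchmark:.1f}")  # 60.0
```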

The QEP Development Committee also reviewed student data pertaining to retention.  A summary of one-year retention rates for USA freshman cohorts (fall 2006 through fall 2011) showed retention declining from 70% in 2006 to 66% in 2011 (Table 8).

Table 8
Freshman Cohort Retention Rate Summary*

Fall 2006: cohort 1,317; retained after one year 926 (70%)
Fall 2007: cohort 1,418; retained after one year 950 (67%)
Fall 2008: cohort 1,495; retained after one year 998 (67%)
Fall 2009: cohort 1,711; retained after one year 1,127 (66%)
Fall 2010: cohort 1,654; retained after one year 1,082 (65%)
Fall 2011: cohort 1,825; retained after one year 1,202 (66%)

*USA Office of Institutional Research, Planning and Assessment

Topic Selection

Upon conclusion of the proposal-evaluation phase, the QEP Development Committee recommended Collaborative Learning and Critical Thinking to the Leadership Team as potential QEP topics.  Pros and cons of each topic were discussed at great length, and input was solicited from university constituents and external sources.  Based on this feedback, the Leadership Team chose to integrate the two topics, and, after a review of collaborative teaching and learning literature and best practices, an instructional strategy called Team-Based Learning (TBL) was selected by the Leadership Team to be used by instructors to facilitate the achievement of project goals.  Team-Based Learning, a form of collaborative learning, utilizes a specific sequence of individual work, group work, and immediate feedback to create a motivational framework in which students are accountable to the instructor and to the other members of their team for coming to class prepared and for contributing to discussion (Michaelsen & Sweet, 2008). Team-Based Learning has been shown in numerous studies to be an effective instructional strategy relative to critical thinking, content acquisition, and engagement (Clark, Nguyen, Bray, & Levine, 2008; Haberyan, 2007; Thompson, Schneider, Haidet, Levine, McMahon, Perkowski, & Richards, 2007).  Additional information pertaining to TBL may be found in the Literature Review and Best Practices section of this document.

The QEP Implementation Team

During the spring of 2012, the Leadership Team appointed a QEP Implementation Team, which was charged with the responsibility of beginning the project including the selection of a TBL consultant and QEP Director.

The Implementation Team solicited faculty participants, selected a TBL consultant, and coordinated a summer TBL workshop.  Dr. Larry Michaelsen was selected as the TBL consultant.  Dr. Michaelsen is Professor of Management at the University of Central Missouri and the David Ross Boyd Professor Emeritus at the University of Oklahoma; he is also a Carnegie Scholar, a Fulbright Senior Scholar, and a former editor of the Journal of Management Education. Dr. Michaelsen is one of the original developers of Team-Based Learning and is viewed as a central figure in its worldwide dissemination.

The Implementation Team also conducted a national search and made a recommendation to the Leadership Team for a Director of the QEP.  Once again, feedback was solicited from the community, faculty, staff, and external experts regarding the requisite credentials for this position.  Two finalists were selected to visit the USA campus.  During their visits, each candidate interviewed with the Implementation Team, the Senior Vice President for Academic Affairs, and faculty; each presented a vision for the project and answered questions.  Applicants also delivered a one-hour open-campus presentation regarding collaborative Team-Based Learning.

Dr. Ronald A. Styron was selected as the QEP Director.  Dr. Styron is a former P-12 teacher and administrator, college administrator, and Professor of Educational Leadership and Research.  Dr. Styron has a wealth of P-20 experience developing, coordinating, and administering professional development programs, as well as teaching, publishing, presenting, and implementing change initiatives.  He also received and administered several large grants aimed at faculty development. 

The QEP Implementation Team included representation from multiple constituencies.  The original team members are listed in Table 9.

Table 9
QEP Implementation Team

Member – Title
Averitt, Jennifer – Staff and Alumni, Engineering
Broome, Barbara – Associate Dean, Nursing
Carr, Nicole – Director, Student Academic Success
Carr, Phillip – Professor, Sociology and Anthropology
Coleman, Robert – Associate Dean, Arts and Sciences
Dempsey, Jack – Director, Center for Innovation in Learning
Estis, Julie – Associate Professor, Allied Health
Exline, Joan (Chair) – Associate Vice President
Gunn, Jennie – Associate Professor, Nursing
Keasler, Diane – Instructor, Nursing
LeDoux, Susan – Associate Dean, Medicine
Maes, Jeanne – Professor, Management
Martin, Cecelia – Director of Assessment
Maulding, Wanda – Associate Professor, College of Education and Professional Studies
McKinney, Dawn – Senior Instructor, School of Computing
Millner, Vaughn – Dean, Continuing Education
Thomas, Crystal – Assistant to the Dean, Arts and Sciences
Vesoulis, Michael – Student
Weldy, Teresa – Assistant Professor, Management
Windom, Charlotte – Instructor, Development Studies

 

The QEP Advisory Council

During the summer of 2012, the QEP Implementation Team was reconfigured to include additional constituents, particularly faculty participating in the Team-Based Learning pilot, and was renamed the QEP Advisory Council.  The purpose of the USA QEP Advisory Council is to support and guide the project.  Council members serve as advocates of the QEP and make recommendations for continuous improvement of all aspects of the QEP, including, but not limited to, implementation, assessment, planning, and budgeting.

The QEP Advisory Council has held regularly scheduled monthly meetings to provide feedback aimed at shaping implementation practices, evaluating success, and guiding project direction as the QEP is deployed.  The QEP Director presents regular updates and reports regarding the progression of activities leading to the attainment of the project's goals and objectives. QEP Advisory Council membership includes students, faculty, administrators, community members, and alumni.  The initial membership included the following personnel (Table 10):

Table 10
QEP Advisory Council

Member – Title
Averitt, Jennifer – Staff and Alumni, Engineering
Bonner, Jessica – Graduate Student
Brooks, Elizabeth – Undergraduate Student
Carr, Nicole – Director, Student Academic Success
Carr, Philip – Professor, Sociology and Anthropology
Chastain, Parker – Student Council Representative
Coleman, Robert – Associate Dean, Arts and Sciences
Conn, Erika – Undergraduate Student
Cwikla, Julie – Director, Center for Integrative Studies in Science, Technology, Engineering, and Mathematics
Daigle, Roy – Community Representative and former Associate Dean, School of Computing
Dempsey, Jack – Director, Center for Innovation in Learning
Estis, Julie – Associate Professor, Allied Health
Gard, Anthony – Associate Dean, Medicine
Gibbs, Diane – Associate Professor, Visual Arts
Guest, Charlie – Interim Vice President, Institutional Research, Planning and Assessment
Gunn, Jennie – Associate Professor, Nursing
Helminger, Paul – Community Representative and former Professor, Physics
Hollinger, Adrian – Graduate Student
Jerkins, Emily – Undergraduate Student
Landry, Jeff – Professor, School of Computing
Litchfield, Brenda – Professor, Professional Studies
Ledoux, Susan – Associate Dean, Medicine
Maes, Jeanne – Professor, Management
Martin, Cecelia – Director of Assessment
Maulding, Wanda – Interim Assistant Vice President
McKinney, Dawn – Senior Instructor, School of Computing
Roombos, Melissa – Alumni Representative
Styron, Ronald A., Jr. (Chair) – QEP Director
Thomas, Crystal – Assistant to the Dean, Arts and Sciences
Threadgill, Tara – Graduate Student
Wilkerson, Trent – Undergraduate Student

 

Focusing Topic Selection

Institutional data on success rates for student learning outcomes, critical thinking, collaboration, engagement, and retention indicated that campus-wide improvement was desirable. Initially, implementation began with professors from across campus representing numerous disciplines.  Upon further consideration, the QEP Advisory Council determined that the QEP should narrow its focus to allow for a concentrated effort in a set of courses that could benefit the most from the project.  The Advisory Council determined that focusing on STEM classes would be the most beneficial to the University.

The decision to focus the QEP on STEM disciplines was based on data contained in the listing of USA's top 25 most difficult courses, as determined by lack of student success (measured by the number of D's, F's, and WD's, and the combined D/F/WD rate). In the 2010-2012 report, STEM classes represented 84% (21) of the 25 courses listed.
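
The screen behind this list is a simple rate: the share of enrollments ending in a D, an F, or a withdrawal. A minimal Python sketch follows; the course names and grade counts are hypothetical, and only the final 21-of-25 figure comes from the report cited above.

```python
# Minimal sketch of the D/F/WD "difficulty" screen described above,
# using hypothetical course names and grade counts (not actual USA data).

courses = {
    # course: (enrolled, D, F, WD)
    "MA 125": (300, 40, 35, 30),
    "CH 131": (250, 30, 25, 20),
    "EH 101": (280, 15, 10, 12),
}

def dfw_rate(enrolled, d, f, wd):
    """Share of students who earned a D or an F, or who withdrew."""
    return (d + f + wd) / enrolled

for name, (n, d, f, wd) in courses.items():
    rate = dfw_rate(n, d, f, wd)
    print(f"{name}: success rate {1 - rate:.1%}, D/F/WD rate {rate:.1%}")

# The 84% STEM share reported for 2010-2012 is simply 21 STEM courses
# out of the 25 most difficult courses: 21 / 25 = 0.84.
print(f"STEM share of the top-25 list: {21 / 25:.0%}")
```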

There is a substantial body of research (Johnson, Johnson, & Stanne, 2000; Kalaian & Kasim, 2009; Prince, 2004; Project Kaleidoscope, 2012; Springer, Stanne, & Donovan, 1999) indicating that, when compared with lecture-based instruction, small-group learning methods, including Team-Based Learning, have a positive impact on student achievement, attitude, and persistence in STEM courses.  Consequently, STEM courses appear to be a good fit for the project.

Furthermore, as presented in Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics by the President's Council of Advisors on Science and Technology (2012), the United States is facing a critical shortage of STEM professionals. If the country is to remain competitive in a global economy, the U.S. will need approximately 1 million more STEM professionals over the next decade than it is currently on track to produce.  To compensate for this shortage, the U.S. will need to increase the number of students who receive undergraduate STEM degrees by 34% over current rates. Increasing the rate of student success in STEM courses at USA will help address this national concern.
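
The 34% figure is straightforward arithmetic. Assuming a baseline of roughly 300,000 undergraduate STEM degrees per year (an assumption used here only to reconstruct the report's calculation), 1 million additional graduates over ten years works out to an increase of roughly one-third over current rates:

```python
# Reconstructing the report's arithmetic under a stated assumption:
# a baseline of ~300,000 undergraduate STEM degrees per year (assumed here)
# and a 1,000,000-graduate shortfall over the coming decade.

current_per_year = 300_000      # assumption, for illustration only
shortfall = 1_000_000
years = 10

extra_per_year = shortfall / years                  # 100,000 more per year
required_increase = extra_per_year / current_per_year
print(f"Required increase over current rates: {required_increase:.0%}")  # ~33%
```

The exact published figure (34%) depends on the baseline used; with a slightly smaller baseline the required increase rounds to 34%.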

The President's Council also reports a need for STEM professors to reduce their reliance on traditional instructional techniques in STEM courses.  The President's Council (2012) stated:

Better teaching methods are needed by university faculty to make courses more inspiring and relevant.  While traditional teaching methods have trained many STEM professionals, including most of the current STEM workforce, a large and growing body of research indicates that STEM education can be substantially improved through a diversification of teaching methods (p. 1).

Team-Based Learning is a non-traditional form of instructional delivery with the potential to improve learning outcomes and increase student engagement in STEM disciplines.

The STEM courses at USA identified for QEP inclusion are accounting, anthropology, biology, biomedical science, chemistry, computing, engineering, geology, mathematics, mathematics/science education, nursing, physical geography, physician assistant studies, physics, and statistics (Figure 1).


Figure 1. Connection between STEM subjects, Critical Thinking, Collaboration, Engagement, Team-Based Learning, and STEM career readiness.

 

 

LITERATURE REVIEW AND BEST PRACTICES 

Introduction

The purpose of this section of the QEP report is to establish a conceptual framework for the project along with a review of related literature.  The topics addressed in the conceptual framework include social learning theory, social interdependence, and constructivism, which provide a theoretical frame for the related literature. The literature review contained in this portion of the document is related to STEM, TBL, critical thinking, collaboration, engagement, and student retention. 

Conceptual Framework

Social Learning Theory

As stated by Albert Bandura (1977) in his text on social learning:

Learning would be exceedingly laborious, not to mention hazardous, if people had to rely solely on the effects of their own actions to inform them what to do. Fortunately, most human behavior is learned observationally through modeling: from observing others one forms an idea of how new behaviors are performed, and on later occasions this coded information serves as a guide for action (p. 22).

The implication for educators is to create an environment in which students may observe the behavior of others, framed by instructional practices.  One such practice is group work, or working in teams.

Since social learning theory is predicated on the concept of group interaction, many educational scholars have developed practices based on teaming.  These practices include cooperative and collaborative learning strategies.  Both practices require student cooperation and most importantly, meaningful student engagement based on active participation in class activities (Smith, Sheppard, Johnson, & Johnson, 2005). 

Social Interdependence

Life outside the academic enterprise is heavily group centered. An education that does not prepare students to work successfully in a group is inappropriate and flawed (Cohen & Bailey, 1997). Accordingly, it is useful to consider an organization's environment from the perspective of social interdependence. Social interdependence exists when "the accomplishment of each individual's goals is affected by the actions of others" (Johnson, Johnson, & Smith, 2007, p. 16). With its foundation in Gestalt psychology (Lewin, 1935), the notion is that common goals addressed in a structured social atmosphere can result in interdependence among members, with the group becoming a "dynamic whole."

Accordingly, the process of positive social interdependence depends on an environment of cooperation. By contrast, negative interdependence is encouraged when individual competition is the norm (Deutsch, 1949). Although student collaborative learning and faculty collaborative content development are being used in postsecondary institutions in every part of the world, an environment of positive social interdependence is still the exception to traditional practice in most universities, including ours.

Constructivism

Bruner (1986) asserted that "most learning in most settings is a communal activity" (p. 127).  Constructivist theory is based on the construction of knowledge.  Knowledge is created in a group setting because the collective activities of students establish a culture that serves as a template for the knowledge gained.  Students are also more likely to use higher-order thinking skills and to go beyond teacher-introduced knowledge when working in groups.

Literature 

Collaboration
Collaborative learning (CL) is a widely used P-20 instructional method, found in most subject areas (Johnson, Johnson, & Smith, 2007). Students who participate in CL groups exhibit higher cognitive processing, increased problem-solving skills, and more creative approaches. They also understand the material better, stay focused longer, and enjoy many other documented benefits. Research also indicates that people take their writing more seriously when peers will evaluate it (Williams, He, Elger, & Schumacker, 2007).

Collaborative learning is a subset of active learning.  When students move from a passive role (listening to lecture, reading, watching a demonstration) to an active role (discussing, simulating the real thing, interacting with others on a working team), they tend to remember a much larger percentage of material and concepts (Prince, 2004). Cognitive skills increase, and so do the affective skills of tolerance, acceptance, and support, along with the ability to resolve conflicts constructively (Johnson, Johnson, & Smith, 1998).

Situating team-based projects within a curriculum can be challenging. In the "real world," teams produce work that is difficult for one person to do alone. This requires selecting more complex learning tasks and focusing more of the instructor's time on cultivating problem solving rather than lecturing. Fortunately, technology makes it relatively easy to record lectures (e.g., video podcasts) and make them available for students to view on demand, freeing class time for instructors to interact with teams as an expert. A key obstacle to this kind of innovation is that faculty may be reluctant to release "control" of class time to students, because a broadcast model of lecture is the most comfortable model of teaching. Even so, the effectiveness of the lecture model has been under attack for several decades (McKeachie, 1990).

Engaged professionals who collaborate in learning teams hold themselves to a higher standard, improve their practice, and lift student achievement (Rich, 2010). As Smith and MacGregor (1992) stated, "teachers who use collaborative learning approaches tend to think of themselves less as expert transmitters of knowledge to students, and more as expert designers of intellectual experiences for students – as coaches or mid-wives of a more emergent learning process."

Critical Thinking
Critical thinking is the evaluation of an intellectual product (an argument, an explanation, a theory) in terms of its strengths and weaknesses (Johnson, 1992). Bloom (1956) divided the way people think into three domains: the cognitive, the affective, and the psychomotor. The cognitive domain emphasizes intellectual outcomes and is further divided into six levels: 1) knowledge, 2) comprehension, 3) application, 4) analysis, 5) synthesis, and 6) evaluation.  Analysis, synthesis, and evaluation are regarded as the outcomes associated with critical thinking; knowledge, comprehension, and application are regarded as the outcomes associated with content competencies (Duke TIP, 2012). Keywords (action verbs) are associated with each level of the cognitive domain; those for the critical thinking levels are listed in Table 11.

Table 11
Bloom's Taxonomy Action Verbs (Critical Thinking Levels)

Analysis
Bloom's definition: Break down objects or ideas into simpler parts and find evidence to support generalization.
Action verbs: analyze, appraise, break down, calculate, categorize, compare, contrast, criticize, diagram, differentiate, discriminate, distinguish, examine, experiment, identify, illustrate, infer, model, outline, point out, question, relate, select, separate, subdivide, test

Synthesis
Bloom's definition: Compile component ideas into a new whole or propose alternative solutions.
Action verbs: arrange, assemble, categorize, collect, combine, comply, compose, construct, create, design, develop, devise, explain, formulate, generate, plan, prepare, rearrange, reconstruct, relate, reorganize, revise, rewrite, set up, summarize, tell, write

Evaluation
Bloom's definition: Make and defend judgments based on internal evidence or external criteria.
Action verbs: appraise, argue, assess, attach, choose, compare, conclude, contrast, defend, describe, discriminate, estimate, evaluate, explain, judge, interpret, relate, predict, rate, select, summarize, support, value
 

Team-Based Learning

Team-Based Learning is rooted in collaborative learning and is a form of active learning.  Team-Based Learning utilizes very specific instructional strategies, including the intentional selection and permanence of student teams, a readiness assurance process, an appeals procedure that empowers students to challenge answers determined by the instructor, and peer evaluation.  Application activities are based on the 4 S's: teams address Significant problems, all teams work on the Same problem, students make a Specific choice among solutions, and teams report their choices Simultaneously (Michaelsen, Knight, & Fink, 2004).

Team-Based Learning has been shown to improve student achievement by increasing student reasoning, problem-solving, and critical thinking skills, encouraging more scientific thinking, and developing a deeper understanding of course content (Dunaway, 2005; Haidet, Morgan, O'Malley, & Richards, 2004; Koles, Nelson, Stolfi, Parmelee, & Destephen, 2005; McInerney & Fink, 2003; Perkowski & Richards, 2007; Vasan & DeFouw, 2005; Zgheib, Simaan, & Sabra, 2010).  Team-Based Learning strategies are based on the conceptual model called "Backward Design" (Wiggins & McTighe, 2005), which is centered on the development of sophisticated insights and abilities, reflected in varied performances and contexts.  Backward Design is a technique in which instructors first determine the goal (end result) of the lesson and then work backward to develop specific learning activities.

Instructors who are able to discern students' levels of thinking, and who use those levels to help students construct knowledge, help them develop a better understanding of content (Darling-Hammond, 1996).  TBL has been suggested to help students who seem uninterested in subject material, do not do their homework, and have difficulty understanding material.  Team-Based Learning can transform traditional content through application and problem solving while also developing interpersonal skills (Knight & Fink, 2002).

Courses that include TBL strategies typically involve multiple group assignments that are designed to improve learning and promote the development of self-managed learning teams (Michaelsen & Sweet, 2008).  Learning how to learn, work, interact, and collaborate in a team is essential for success in this kind of environment (Hills, 2001).  Required skills include listening, questioning, and persuading; members must respect each other; and there must be a spirit of helping, sharing, and participation.  Team-Based Learning strategies include real-life experiences with clear applications to course content. These strategies help students understand course concepts, allow them to work on complex intellectual tasks, and offer them the opportunity to move beyond their individual capabilities.

Team-Based Learning includes the following components (a minimal grading sketch follows this list):

  • Team formation
      - Instructors purposefully select team members
      - Team composition is diverse
      - Teams are held strictly accountable for their products
  • Readiness assurance
      - Students read and prepare assigned materials
      - Individual Readiness Assurance Test (iRAT): a test given at the end of a unit to assess individual mastery of content and readiness to apply it
      - Team Readiness Assurance Test (tRAT): identical to the iRAT, but taken by the team immediately after the iRAT with members working on a single answer sheet
      - Appeals: a team can appeal answers with appropriate evidence
      - The instructor gives a clarifying lecture focusing on areas of weakness identified by the test results
  • Application
      - Students make decisions to solve discipline-based problems using cases, data, or other evidence determined by the instructor
      - Significant problem(s) must be addressed
      - Specific choice(s) must be made using course concepts (what and why)
      - All teams work on the same case/problem
      - Teams report simultaneously
  • Peer feedback
      - Points are distributed to team members
      - Collected at mid-semester and end of semester, or as appropriate for the course layout
      - Emailed to students in aggregate to protect student identity
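
The QEP does not prescribe a grading formula, but the components above are commonly combined as weighted scores, with the team components adjusted by a peer-evaluation multiplier. The following minimal Python sketch shows one such scheme; the weights and scores are hypothetical, not part of the TBL specification or of this plan.

```python
# Minimal sketch of one common TBL grading scheme.
# Weights and scores are hypothetical; the QEP does not prescribe a formula.

def course_grade(irat, trat, application, peer_multiplier,
                 w_irat=0.25, w_trat=0.25, w_app=0.50):
    """Combine individual (iRAT) and team (tRAT, application) scores,
    with the team components adjusted by a peer-evaluation multiplier."""
    team_part = (w_trat * trat + w_app * application) * peer_multiplier
    return w_irat * irat + team_part

# Hypothetical student: iRAT 78, team tRAT 92, team application 88,
# peer-evaluation multiplier 1.02 (slightly above-average contribution)
print(f"Course grade: {course_grade(78, 92, 88, 1.02):.1f}")  # 87.8
```

In many TBL courses the class itself negotiates these weights at the start of the term, which reinforces the accountability to teammates described above.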

Team-Based Learning provides an infrastructure that promotes collaborative learning, exchange of course content, collegial teaching, critical thinking, collaboration, and student engagement. Students work in collaborative teams to gain more meaningful and diverse experiences.  As described earlier in this document, courses in Science, Technology, Engineering, and Mathematics were selected as the focus for this project because the global economy has "flattened" the world in terms of skills and technology. As a result, the United States is experiencing a chronic decline in homegrown STEM talent while becoming increasingly dependent upon foreign scholars to fill workforce and leadership roles. Action must be taken at the postsecondary level to promote the development of a citizenry with expertise in these content areas (President's Council of Advisors on Science and Technology, 2012).  Accordingly, the purpose of TEAM USA will be to improve student learning in STEM courses by increasing student critical thinking and collaborative skills through Team-Based Learning instructional strategies.  This will increase students' chances of successfully completing STEM degree programs and improve their ability to function effectively in the 21st-century global workplace.

Team-Based Learning provides an instructional environment that is designed to allow students within classes to achieve the outcomes of increased critical thinking skills and enhanced ability to work in teams and collaborate. Students regularly work in an environment where there are stakes associated with being a good collaborator. Hence, one learning outcome that should come from properly implemented Team-Based Learning is that students should develop skills as collaborators. Done in an appropriate way, faculty and peer ratings of performance on teams should provide an accurate picture of student skill in collaboration. The regularity and stakes associated with Team-Based Learning provide faculty members and peers with more opportunities to collect evidence and form a valid and reliable assessment of student collaboration.
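
As one concrete, purely illustrative way such peer ratings might be tallied, the sketch below assumes each team member divides a fixed pool of points among teammates and that a member's collaboration score is the mean of the points received; the allocation scheme and names are assumptions, not a procedure mandated by this plan.

```python
from statistics import mean

def peer_scores(allocations):
    """allocations maps each rater to {teammate: points}; raters do not rate
    themselves.  Returns each member's mean received points."""
    received = {}
    for rater, ratings in allocations.items():
        for teammate, points in ratings.items():
            received.setdefault(teammate, []).append(points)
    return {member: mean(pts) for member, pts in received.items()}

# Hypothetical 3-person team, each member distributing 100 points to the others.
team = {
    "Ana": {"Ben": 55, "Cam": 45},
    "Ben": {"Ana": 60, "Cam": 40},
    "Cam": {"Ana": 50, "Ben": 50},
}
print(peer_scores(team))  # {'Ben': 52.5, 'Cam': 42.5, 'Ana': 55}
```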

Team-Based Learning also provides a context that facilitates increases in critical thinking within classes.  First, discourse over answers to the team readiness assurance test provides a context for constructive controversy (Johnson & Johnson, 2009) that creates the need for critical thinking and analysis. Second, students think about and discuss solutions to applied and significant problems.  The significant number of opportunities for students to apply, create, and evaluate should lead to better critical thinking performance on classroom and QEP-wide assessments of student learning.  With critical thinking and collaboration as the focus of the TBL intervention, student engagement and retention within the university and within majors should also improve.  So, along with specific learning outcomes, we should see reports of increased engagement on the NSSE and better student retention.

The Impact of Team-Based Learning in STEM Courses

Carmichael (2009) sought to examine the viability of TBL strategies when applied to large-scale biology classes. In particular, he wanted to determine if student-centered TBL methods were an effective alternative to traditional instructor-centered lectures by comparing student performance in each section.  He incorporated TBL into one of two Introductory General Biology classes averaging 200 students per class.  The first group used traditional lecture-based techniques and the second class used TBL.  Data indicated the TBL class scored higher on all tests during the semester than the traditional class, with the exception of the final exam in which students performed at comparable levels.  Grades for the TBL class were significantly higher than the lecture-based class with TBL students earning more A鈥檚 and B鈥檚 and fewer D鈥檚 and F鈥檚.  Carmichael also found TBL students responded to exam questions that included data-interpretation with significantly more accuracy than the lecture-based classroom students, indicating increased critical thinking skills. End of semester surveys also indicated students from the TBL class demonstrated more critical thinking ability than students in the lecture-based environment. Furthermore, Carmichael also found that student engagement in classroom settings was more pronounced where TBL was used and students appeared to be more inclined to ask meaningful questions in class. Student comments recorded on the instructor鈥檚 evaluation form suggested that a majority of students believe TBL improved learning of general biology.  Results also endorsed the implementation of TBL as a beneficial tool in increasing student performance and engagement in a large-enrollment undergraduate introductory science course.  

In Using Team Learning to Improve Student Retention, Kreie, Headrick, and Steiner (2007) applied TBL methods in an introductory Information Systems (IS) course with the intention of increasing student achievement and retention.  The results were compared to the same class taught with traditional lecture-based instruction.  The researchers sought to decrease not only the number of students who dropped the course but also the number who stopped coming to class as the semester progressed.  Therefore, their final measure of retention included only the percentage of students taking the final examination. Retention of the TBL students was significantly higher than that of traditionally instructed students, with 85.5% of TBL students taking the final as compared to 71.6% of students enrolled in the traditional lecture section.  Instructors noted that students were more motivated to attend class on group activity days due to team commitment. Kreie, Headrick, and Steiner concluded that TBL was an advantageous pedagogy to aid in the engagement and retention of students.

Baepler, Cotner, and Kellerman (2008) incorporated the Immediate Feedback Assessment Technique (IF-AT) into large-enrollment introductory general biology classes. Aside from producing immediate feedback, the researchers expected IF-ATs to highlight misconceptions for correction and to promote group discussion.  The course structure incorporated mini-lectures coupled with group assignments that included IF-AT activities.  Groups were randomly set at the beginning of the semester, and students remained in those groups for the entirety of the course. Student perceptions regarding the usefulness of the IF-AT were measured using survey instruments.  The researchers were especially interested in how female students received the IF-AT.  Data indicated that a compelling portion of students thought that IF-AT activities enhanced exam performance and the recognition of misconceptions about subject matter. While overall responses from both genders were positive, survey responses indicated that female students valued immediate feedback as an advantageous comprehension tool significantly more than male students did. Consequently, Baepler, Cotner, and Kellerman noted that techniques such as the IF-AT may provide a way to enhance the engagement, and thus the retention, of women in science disciplines, where retention remains a consequential and persistent problem.  Results also indicated that use of the IF-ATs facilitated constructive group discussion, provided crucial corrective feedback, and promoted overall student collaboration.
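
IF-AT forms reveal whether an answer is correct the moment it is scratched, and instructors commonly award decreasing partial credit for later scratches. The sketch below illustrates that scoring convention with a hypothetical point schedule; publishers and instructors set their own.

```python
def ifat_item_score(scratches_used, schedule=(4, 2, 1, 0)):
    """Score one IF-AT item by the number of scratches needed to find the answer.

    scratches_used -- 1 means correct on the first try
    schedule       -- hypothetical points awarded per attempt (first, second, ...)
    """
    if scratches_used < 1:
        raise ValueError("at least one scratch is required")
    index = min(scratches_used, len(schedule)) - 1
    return schedule[index]

# A team needing 1, 1, 3, and 2 scratches on four items:
print(sum(ifat_item_score(s) for s in (1, 1, 3, 2)))  # 4 + 4 + 1 + 2 = 11
```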

In a study conducted in 2010, Gomez found that traditional TBL methods could successfully be implemented in hybrid Information Systems (IS) courses that incorporated both face-to-face and online components. Students participating in the course expressed improved enjoyment of and interest in course material.  As indicated by student survey data, students believed that TBL methods (particularly iRATs) resulted in increased knowledge of course material.  While this study did not measure student performance, Gomez did find a significant correlation between student perceptions of "motivation and enjoyment" and perceptions about the quality of their learning experience. Gomez concluded that "team activities may help students enjoy more what they need to learn, eventually achieving higher learning" (Gomez, 2010, p. 389).

Citing inadequacies in the traditional lecture-based approach to organic chemistry instruction and using a slightly modified TBL method, Dinan and Frydrychowski (1995) evaluated the effectiveness of team-based learning in an introductory organic chemistry course. Students were placed in teams of five or six, and the groups remained the same throughout the semester. Instructors utilized other aspects of TBL including mini-lectures, the appeal process, and readiness assurance.  Instructors observed that student response was very positive in the team-learning atmosphere: students arrived to class early, began working within their groups immediately, and needed little to no coaxing to ask questions or engage in discussion.  Preparation, participation, and attendance all increased within the team-learning environment. At the end of the semester, surveys indicated that an overwhelming majority of students felt accountable to their team to be prepared for and attend class sessions.   Instructors also observed that with increased group exposure, individual performance increased for all ethnicities. While data were limited, minority students enjoyed a higher success rate than under traditional lecture methods, with 100% minority student retention and 80% of those students earning a B average or above in the course.  In addition to increased student participation and preparation, more chapter content was covered using the TBL method than in past lecture-based sections.  Eighty-four percent of students in the class responded that team learning was a successful technique for learning organic chemistry.

Drummond (2012) examined the effects of using Team-Based Learning techniques on students鈥 critical thinking skills in an engineering entrepreneurship class.  He analyzed three semesters of student data using the Critical Thinking Skills (CTS) rubric created by Washington State University as a measure of improvement. During one semester TBL strategies were used, and during the other two semesters traditional lecture-based strategies were used. Data generated by CTS activities coupled with instructor observations indicated a correlation between TBL and increased critical thinking skills. The CTS category measuring the ability to develop an individual hypothesis showed significant improvement among TBL students versus students in the non-TBL classes. Drummond noted this particular aspect of CTS measurement was used to exemplify typical results across all dimensions due to the large amounts of data collected.   
Other outcomes of TBL implementation included improved student participation and class preparation.  Data showed that in the non-TBL environment student participation peaked at 25%; however, in TBL classes, student participation averaged approximately 70%. Instructors also observed increased participation among English as a Second Language (ESL) students, along with other students who were usually reluctant to participate in class activities.

Banfield, Fagan, and Janes (2012) conducted a study of registered nurses in the Registered Nurses Professional Development Centre's Critical Care Nursing Program.  Instructors implemented TBL methods in their classrooms to measure the impact on students' critical thinking skills. Data indicated that TBL helped provide effective preparation for real-world experiences by engaging students in critical thinking otherwise not possible during traditional lecture.  Through their comments, students expressed a belief that TBL methods resulted in greater retention of course material while also providing greater opportunities to exercise critical thinking.  There was no significant difference in student performance across methods, but instructors did observe higher levels of student engagement and team-centered problem solving.

In 2009, Kalaian and Kasim conducted a meta-analysis of 193 studies to determine the effectiveness of group-based instruction (cooperative learning, collaborative learning, problem-based learning, team-based learning, peer learning, and inquiry-based learning) as compared to traditional lecture-based instruction in college STEM classes.  The purpose of the project was to produce scientific evidence that could help determine whether small-group learning strategies were more effective than lectures for improving student learning and persistence, along with attitudes toward STEM subjects.  Results indicated that, in varying degrees, all forms of small-group learning had a positive impact on student achievement, attitude, and persistence.  These findings are consistent with other meta-analytic findings (Springer, Stanne, & Donovan, 1999; Johnson, Johnson, & Stanne, 2000) about the effectiveness of small-group learning methods in increasing students' achievement in STEM college classrooms.
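
Meta-analyses such as Kalaian and Kasim's typically pool study effect sizes using inverse-variance weights. The sketch below shows that standard fixed-effect computation on made-up numbers; it is not their data.

```python
def pooled_effect(effects, variances):
    """Fixed-effect, inverse-variance weighted mean effect size."""
    weights = [1 / v for v in variances]
    return sum(w * d for w, d in zip(weights, effects)) / sum(weights)

# Three hypothetical studies comparing small-group vs. lecture instruction.
d = [0.30, 0.55, 0.20]   # standardized mean differences
v = [0.02, 0.05, 0.01]   # sampling variances
print(round(pooled_effect(d, v), 3))  # 0.271
```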

Summary

Collaborative learning is grounded in social learning theory and constructivism.  It serves as a conceptual umbrella encompassing many instructional strategies, including cooperative and active learning.  Within this conceptual framework, related pedagogical techniques such as problem-based learning, inquiry learning, and team-based learning form a subset of active learning.

Team-Based Learning utilizes very specific instructional strategies, including the intentional selection and permanence of student teams, a readiness assurance process, grade appeals, and peer evaluation.  Application activities are based on the 4 S's: teams address Significant problems, all teams work on the Same problem, students make a Specific choice among solutions, and each team reports its choice Simultaneously (Michaelsen, Knight, & Fink, 2004).

Team-Based Learning has been widely implemented in the health sciences (medicine, nursing and pharmacy) and business.  In these disciplines TBL has been established as an effective technique for improving content acquisition, critical thinking, and collaboration (Parmelee, 2010).  While TBL has not been implemented as widely in other STEM disciplines, several studies have shown that TBL improves these learning outcomes in such courses.  This QEP will build on that foundation and will attempt to improve learning outcomes across a wide range of STEM disciplines.

 

DESIRED STUDENT LEARNING OUTCOMES

It is hypothesized that the integration of higher order thinking and collaboration will lead to improved student learning outcomes in STEM courses.  Learning and teaching in STEM disciplines are often misaligned and weakly assessed (Bransford, Brown, & Cocking, 1999). Nobel Prize-winning physicist Carl Wieman (2010) notes that even though those teaching the sciences believe that students need to learn concepts, they often teach and assess only factual information.  Further, he claims that they often presume teaching factual information will somehow translate into students learning concepts. Wieman, who has taken on the mission of improving science learning in higher education, argues that an explicit focus on the conceptual aspects of learning is necessary to improve science learning. Team-Based Learning has been shown to be effective in increasing conceptual learning and in providing a context for higher order learning outcomes like those suggested in Bloom's Taxonomy (e.g., application, evaluation, and synthesis).  Thus, the most important goal of this QEP is for students participating in TBL classes to go beyond the memorization of facts, to learn scientific concepts, and to be able to apply them in solving problems.

Andreas Schleicher of the Organisation for Economic Co-operation and Development notes that "the skills that are easiest to teach and test are also the skills that are easiest to digitize, automate and outsource."  Schleicher observes that jobs built on routine cognitive skills, such as memorizing and carrying out simple procedures, are disappearing and that the jobs of the 21st century will require non-routine cognitive skills involving the ability to think critically and creatively.  Consequently, student outcomes in the 21st century need to focus on the acquisition of critical thinking skills.  Schleicher also points out that collaboration is a key skill for the 21st century.  He believes that individuals who know how to collaborate and communicate with others, both in person and using new digital technologies, are more apt to be successful in a dynamic, ever-changing world.

The student learning outcomes in this project mirror Schleicher's concerns by providing learning contexts that support both critical thinking and collaboration.  Team-Based Learning was selected as the instructional strategy because it is an evidence-based collaborative model designed to improve student achievement, critical thinking, collaboration, and engagement.  These factors were targeted for improvement after a careful review of student data, as described in Section III of this document.  Furthermore, students who experience success in their courses through enhanced learning, improved critical thinking, collaboration, and engagement are more likely to persist in STEM degree programs.

Student Learning, Project Outcomes

Student learning is defined as improvements in knowledge, skills, attitudes, competencies, and habits of mind (Jankowski & Provezis, 2011).  Project outcomes are comprehensive and represent the broadest level of outcomes contained within the project.  Project outcomes serve as the framework for both program and common courses student learning outcomes.

Project Outcome 1:    Students will achieve higher mastery levels of course content and real-world application of the content.
Project Outcome 2:    Students will develop higher levels of critical thinking skills.
Project Outcome 3:    Students will develop higher levels of collaborative skills.
Project Outcome 4:    Students will have higher levels of engagement.
Project Outcome 5:    Students will increase persistence in STEM courses.

Student Learning, Course Outcomes

Student learning outcomes are specific to each STEM class and tailored to the curriculum of individual classes.  Student learning outcomes are designed around course content and higher order thinking skills as defined by Bloom (1956).  Each STEM course will have no fewer than three (and typically three to five) student learning outcomes that incorporate higher-level thinking skills (analysis, synthesis, and evaluation).  Student learning outcomes addressing knowledge, comprehension, and application will be utilized as appropriate.  Each student learning outcome will be connected to a specific assessment.  Professors will also establish assessment targets for student learning outcomes that represent appropriate levels of mastery.

The classes scheduled for project inclusion during the first three years of the project include accounting, anthropology, biology, biomedical sciences, cell biology and neuroscience, chemical and biomolecular engineering, chemistry, civil engineering, computer and information science, earth science, electrical and computer engineering, geology, geography, mechanical engineering, nursing, pathology, pharmacology, physician assistant studies, speech pathology, and statistics.  Additional courses from mathematics, mathematics education, physics, and science education will be added for implementation during years 4 and 5.  The following student learning outcomes, assessments, and assessment targets (Tables 12-17) are examples of those included in all QEP STEM classes.
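
Every assessment target in the tables below has the same shape: at least X% of students must reach a score threshold of Y%. The following minimal Python sketch (the function name and class scores are invented for illustration) shows how such a target could be checked against course data.

```python
def target_met(scores, threshold_pct, required_pct):
    """True if at least required_pct of students scored >= threshold_pct."""
    if not scores:
        return False
    meeting = sum(1 for s in scores if s >= threshold_pct)
    return 100 * meeting / len(scores) >= required_pct

# Hypothetical class scores checked against an
# "80% of students at 80% correct" target.
scores = [95, 88, 82, 79, 91, 85, 60, 84, 90, 83]
print(target_met(scores, threshold_pct=80, required_pct=80))  # True
```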

Table 12
Beginning Programming (Computing)--CIS 115
Instructor: Mrs. Dawn McKinney

Student Learning Outcome | Course Assessment | Assessment Target
1. Students will demonstrate the use of relational and logical operators to create correctly designed conditional expressions. | Test questions; programming assignments | 80% of students will achieve 80% correctness.
2. Students will construct procedures to carry out tasks of a program, including passing appropriate parameters. | Programming assignments | 70% of students will achieve 80% correctness.
3. Students will differentiate among syntax (compile-time), execution (run-time), and logical (output) errors. | Test questions | 70% of students will achieve 80% correctness.
4. Students will trace and analyze an algorithm for expected behavior. | Test questions | 70% of students will achieve 80% correctness.
5. Students will test and debug simple programs to produce a correct solution. | Programming assignments | 70% of students will achieve 80% correctness.
6. Students will write programs to solve problems involving simple operations on single-dimension arrays. | Programming assignments | 70% of students will achieve 70% correctness.

 

Table 13
Information Systems/Information Technology (Computing)--ISC/ITE 475
Instructor: Dr. Jeff Landry

Student Learning Outcome | Course Assessment | Assessment Target
1. Students will apply project management principles and techniques for managing in multiple Project Management Body of Knowledge (PMBOK) areas. | Homework problems and test questions requiring students to estimate, evaluate, and schedule the interrelated activities of an IS project using precedence diagramming and the critical path method | 80% of students will achieve an 80% correctness score.
2. Students will solve project management problems using a project management software tool. | Homework problems requiring students to use a project management software tool for scheduling and cost analysis | 80% of students will achieve an 80% correctness score.
3. Students will solve project management problems, such as project comparison, selection, and estimation, using electronic spreadsheet software. | Test questions asking students to make project selection decisions using multiple techniques; a case analysis that integrates multiple selection techniques and requires students to justify decisions; a spreadsheet analysis of projects using financial and weighted scoring methods | 80% of students will achieve 80% correctness on test questions; 80% or more will make an acceptable to strong defense of a selection decision using multiple decision criteria, as graded by a 4-point rubric; 80% will make a correct project selection decision backed up by correct calculations.

 

Table 14
Issues in Biomedical Science--BMD 493
Instructor: Dr. Cindy Stanfield

Student Learning Outcome | Course Assessment | Assessment Target
1. Students will be able to explain the basic communication process within and between neurons, including electrical signals occurring across the cell membrane. | Team activities will include analysis of electrical recordings of neuronal activity; similar questions will appear on the first RAT. | 75% of all students will score 80% or better on the specific iRAT questions, and 80% of teams will score 90% or above on the specific tRAT questions.
2. Students will be able to describe the major classes of neurotransmitters and explain the more common mechanisms by which neurotransmitters produce responses in the post-synaptic cell. | Team activities will include drawing a neural circuit that includes neurotransmitters and receptors; teams will assess each other's work. | Teams will score 80% or above on their drawings.
3. Students will be able to compare the different sensory systems with respect to sensory transduction, sensory pathways, and sensory coding. | RATs 5 and 6 will have embedded questions to test this knowledge. | 75% of all students will score 80% or better on the specific iRAT questions, and 80% of teams will score 90% or above on the specific tRAT questions.
4. Students will be able to explain motor control and compare the lower motor neuron controls, upper motor neuron controls, and supportive input to motor control. | RAT 6 will have embedded questions to test this knowledge. | 75% of all students will score 80% or better on the specific iRAT questions, and 80% of teams will score 90% or above on the specific tRAT questions.
5. Students will be able to describe the anatomy of the autonomic nervous system and discriminate between effects of the parasympathetic and sympathetic branches. | RAT 7 will have embedded questions to test this knowledge. | 75% of all students will score 80% or better on the specific iRAT questions, and 80% of teams will score 90% or above on the specific tRAT questions.
6. Students will be able to identify structures in brain specimens. | Students will spend a minimum of 10 hours in the neuroanatomical lab, and each team will turn in a journal with appropriately labeled pictures of brain specimens. | As a team assignment, 100% of teams will complete the journal, and 80% of students are expected to participate fully.
7. Students will be able to analyze higher brain functions and distinguish between known functions and theoretical functions. | Students will perform activities related to Sacks' book of brain cases, completed prior to reading the text; iRATs 7 and 8 will encompass higher brain function. | 75% of groups will score 80% or above on the activities, and 75% of all students will score 80% or better on the specific iRAT questions.
8. Students will be able to evaluate case studies. | Questions on the iRAT and Exam 3 will relate to case studies. | 75% of all students will score 80% or better on the specific iRAT questions, and 80% of teams will score 90% or above on the specific tRAT questions.

 

Table 15
Landscaping Patterns and Processes (Geography)--GEO 105
Instructor: Dr. Miriam Fearn

Student Learning Outcome | Course Assessment | Assessment Target
1. Students will develop, describe, explain, and defend sound environmental decisions based on potential disasters related to tectonic plate boundaries (e.g., volcanoes, earthquakes). | Questions on the midterm exam related to disaster planning | A minimum of 80% of students will answer the questions correctly.
2. Students will develop, explain, and defend sound environmental decisions based on fluvial processes (e.g., flooding, erosion, water resources). | Questions on the final exam related to disaster planning | A minimum of 80% of students will answer the questions correctly.
3. Students will develop, explain, and defend sound environmental decisions related to coastal processes (e.g., beach nourishment, coastal development). | Questions on the midterm exam related to disaster planning | A minimum of 80% of students will answer the questions correctly.

 

Table 16
Physical Anthropology (Anthropology)--AN 210
Instructor: Dr. Philip Carr

Student Learning Outcome | Summative Assessment | Assessment Target
1. Students will understand basic scientific principles of evolution. | Week 6 tRAT and individual lab exit quiz | A minimum of 90% of students will score 90% or higher on these assessments.
2. Students will understand basic scientific principles of genetics. | Week 5 tRAT and individual lab exit quiz | A minimum of 90% of students will score 90% or higher on these assessments.
3. Students will categorize and describe the species Homo sapiens within the broader biological classification of living organisms, especially primates. | Week 7 tRAT and individual lab exit quiz | A minimum of 90% of students will score 90% or higher on these assessments.
4. Students will examine selected scientific notions regarding the biological origins and development of the human species and the fossil discoveries on which they are based. | Weeks 10 and 12 tRATs and individual lab exit quizzes | A minimum of 90% of students will score 90% or higher on these assessments.
5. Students will be able to identify basic elements of the human skeleton and conduct basic forensic anthropological analyses. | Weeks 2, 15, and 16 tRATs and individual lab exit quizzes | A minimum of 90% of students will score 90% or higher on these assessments.
6. Students will compose an essay applying critical thinking skills in appraising the principles on which race classifications are based, explaining the evolutionary concept of cline as an explanation of human skin color differences, and examining race as a biological and social construct. | Critical Thinking Project, "Does Race Exist?" | A minimum of 90% of students will have an average score of 3.5 or higher on the project rubric.

 

Table 17
Structural Analysis (Engineering)--CE 384
Instructor: Dr. John Cleary

Student Learning Outcome | Summative Assessment | Assessment Target
1. Students will calculate the reaction forces for determinate structures such as beams, trusses, and simple frames. | Questions from Exams I and II, homework problems, group work problems, and RATs | Overall average for all students will be at or above 80%.
2. Students will analyze determinate trusses using the method of joints and the method of sections. | Questions from Exam II, homework problems, group work problems, and RATs | Overall average for all students will be at or above 75%.
3. Students will calculate internal forces in beams and construct the shear and bending moment diagrams. | Questions from Exam II, homework problems, group work problems, and RATs | Overall average for all students will be at or above 75%.
4. Students will use a state-of-the-art computer analysis program to analyze fairly complex structures. | Projects and group work | Overall average for all students will be at or above 80%.
5. Students will calculate deflections for beams, trusses, and frames using classical methods. | Questions from Exam III and the Final Exam, homework problems, group work problems, and RATs | Overall average for all students will be at or above 70%.
6. Students will analyze indeterminate beams and frames using approximate methods, and will also analyze continuous beams using the consistent deformations method. | Questions from Exam III and the Final Exam, homework problems, group work problems, and RATs | Overall average for all students will be at or above 70%.

 

 
ACTIONS TO BE IMPLEMENTED 

Involving Faculty

To encourage faculty involvement and recruit participants, the QEP Director will meet with Deans, Chairs, and other faculty leaders on an ongoing basis throughout each year of the project to identify instructors who may be interested in participating in the QEP.  After interested instructors have been identified, the QEP Director will meet with them privately or in small groups to solicit their involvement and invite them to QEP activities, including professional development and community building.  The QEP Director will then follow up to ensure a smooth transition into the project.

Rogers (2003) aligned Lewin's (1948) change theory with his innovation theory and categorized organizational adoption of a new innovation into five stages.  These stages are awareness (individuals are first exposed to a new innovation but do not have any information regarding it), interest (individuals become interested in the innovation and seek out additional information), evaluation (individuals make a decision regarding the value or effectiveness of the innovation), trial (individuals use the innovation and determine its usefulness), and adoption (individuals finalize their decision regarding the innovation and continue or discontinue its use).

In order to maximize participation and project effectiveness, the QEP Director will embrace the dynamics of change (here, the adoption of TBL as an instructional strategy) by using a research-based strategy demonstrated to be successful. This is necessary because implementing new pedagogical strategies represents a major change and, as such, has a substantial impact on faculty culture.  To maximize the chance of successful implementation of TBL strategies, the QEP Director will utilize an approach to implementing change based on the conceptual framework of Kotter (1996).  The plan used by the QEP Director will be to:

  • Help create a sense of urgency and convince instructors of the need for utilization of TBL strategies
  • Help create and communicate a vision relative to the benefits of utilizing Team-Based Learning strategies
  • Identify and involve early adopters in the process of implementing Team-Based Learning strategies
  • Celebrate successful implementation of TBL Strategies by recognizing short-term success, analyzing implementation and making modifications to the plan as appropriate
  • Institutionalize the use of TBL Strategies

Phased Inclusion of Faculty

Members of an organization, in this case university faculty, have different ways of reacting to change. Rogers (2003) studied these reactions and developed categories to help leaders understand the personal dynamics involved and thus facilitate successful change. The change theory, called Diffusion of Innovation, includes 5 categories of organizational membership.  They are Innovators, Early Adopters, Early Majority, Late Majority, and Laggards.

Innovators typically represent approximately 2.5% of an organization.  Innovators tend to be the first to embrace change and may be engaging in practices not used by the majority of the personnel in the organization; they also do not necessarily communicate well with others within the organization. Early Adopters represent approximately 13.5% of an organization.  Early Adopters tend to be positive, willing to change to improve practice, and respected by the majority of personnel in the organization.  The Early Majority, approximately 34% of an organization, is influenced by the Early Adopters, while the Late Majority, also approximately 34%, adopts only after seeing the majority experience success.  The Laggards, approximately 16%, are extremely resistant to change, even when it is an improvement.  As Rogers' research indicated, the key to organizational change is identifying the Early Adopters and doing whatever is possible for them to experience success. This strategy will ultimately lead to successful implementation of new initiatives.
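
For planning purposes, Rogers's category shares can be turned into rough head counts for a faculty population of a given size; the short sketch below does this for a hypothetical population of 200.

```python
ROGERS_CATEGORIES = {          # Rogers (2003) diffusion-of-innovation shares
    "Innovators": 0.025,
    "Early Adopters": 0.135,
    "Early Majority": 0.34,
    "Late Majority": 0.34,
    "Laggards": 0.16,
}

def expected_counts(population):
    """Expected number of faculty in each adopter category."""
    return {name: round(share * population)
            for name, share in ROGERS_CATEGORIES.items()}

print(expected_counts(200))
# {'Innovators': 5, 'Early Adopters': 27, 'Early Majority': 68,
#  'Late Majority': 68, 'Laggards': 32}
```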

The QEP Director will focus his initial efforts on identifying and recruiting Early Adopters to participate in the project and will then support and nurture them as they implement TBL strategies. As the project grows and instructors experience success, other instructors (Early and Late Majority) will be invited to participate. The QEP Director has initiated a process of identifying and recruiting Early Adopters and has more than 30 STEM instructors committed to the project.  These instructors will be phased in over the first three years of the project.  They include Diane Abercrombie, Waleed Al-Assadi, Brenda Beverly, Philip Carr, John Cleary, Jim Connors, Mary Anne Connors, Jennifer Coym, Jim Davis, Patti Davis, Catherin Dearman, Julie Estis, Joseph Farmer, Mimi Fearn, Chondra Freeman, Natalie Gauer, Coral Gubler, Jennie Gunn, Russel Hardin, Sue Hayden, Susan Hickey, Aletha Hill, Jeff Landry, Susan LeDoux, Dawn McKinney, David Nelson, Srinivas Palanki, Bettina Riley, Tim Sherman, Cindy Stanfield, Kandy Smith, Jennifer Styron, Alan Tucker, Julio Turrens, Bin Wang, and Sheila Whitworth.  Their courses include accounting, anthropology, biology, biomedical sciences, cell biology and neuroscience, chemical and biomolecular engineering, chemistry, civil engineering, computer and information science, earth science, electrical and computer engineering, geology, geography, mechanical engineering, nursing, pathology, pharmacology, physician assistant studies, speech pathology, and statistics.  Additional instructors from mathematics, mathematics education, physics, and science education will be recruited during years 3 and 4 of the project for participation in years 4 and 5.  Upon conclusion of the project (2017-18), there will be no fewer than 50 participating STEM instructors.

The phased inclusion of faculty may be one of the most important components of the action plan. The importance of recruiting instructors, beginning with the Early Adopters, and providing them with appropriate coaching and support cannot be overstated; without coaching and support, implementation of the project would fail (DuFour & Eaker, 1998).  Focusing time, energy, and resources on the wrong segment of the instructor population, relative to Rogers's (2003) diffusion of innovation model, may lead to a lack of faculty participants.

Professional Development

Exceptional institutions must have exceptional faculty; therefore, improving faculty skills and knowledge should be the highest priority, especially when implementing new initiatives (Holland, 2005).  Haar (2001) identified quality ongoing professional development as an essential component of faculty development and growth: if faculty members stop growing, their students also cease to grow.  Faculty knowledge and skill have the greatest influence on student learning and achievement.  For faculty to remain competent, they should be provided with ongoing, high-quality professional development. It is a powerful tool for implementing innovation and the only way universities can move from where they are to where they want to be.

Louie and Hargrave (2006) stated that there are three different forms of staff development: 1) formal professional development, such as "technology workshops, summer institutes, credit courses, and study groups" (p. 15); 2) ongoing or informal professional development, such as "coaching, mentoring, and co-teaching" (p. 15); and 3) online professional development, such as "online courses and online workshops."  The QEP professional development plan will incorporate all of these modalities.  Professional development will focus on instructional topics that support the QEP topic of TBL and will be coordinated through the University of South Alabama Innovation and Learning Center (ILC).  The QEP Director will work closely with the ILC Director to identify expert presenters and arrange session logistics. Participants will enroll in professional development sessions via the online system maintained by the ILC.  Ongoing professional development will be provided for the duration of the project and will include summer, fall, and spring workshops personally conducted by Dr. Larry Michaelsen.

Professional development will be based on the broad concepts of a) assessment of higher order thinking, b) course redesign using Backward Design (Wiggins & McTighe, 1998, 2005), and c) construction of student learning outcomes around higher order thinking. Specific professional development topics, based on these broad concepts, will be personalized to support TBL strategies. 

Topics may be adjusted based on instructor needs and formative data generated by project assessments. Professional development topics will be directly related to the development, acquisition and delivery of TBL strategies (Table 18).  Seventy-one participants have attended professional development sessions held during the summer and fall of 2012.  Session topics were Cognitive Coaching, Critical Thinking, Designing Team-Based Learning Activities, Getting Started with Team-Based Learning, Reciprocal Questioning, Team-Based Learning Introduction, Team-Based Learning Overview, and Using Technology to Support Team-Based Learning. 

Table 18
Professional Development Topics and Content

Professional Development Topic | Content
Application Activities | Activities designed to stimulate discussion, decision-making, engagement, collaboration, and thought; examples include case studies, scenarios, and problem solving
Coaching* | Developing support mechanisms and strategies for helping instructors talk through TBL problems while also identifying solutions; coaching sessions should be organized in small groups by college
In-Class Activities (4 S's) | Utilizing the 4S strategy: Significant problems, Same problem, Specific choice, Simultaneous reporting
Lesson Design | Utilizing the principles of Backward Design: beginning with the desired results, then determining evidence of learning, and ending with curriculum and instruction
Questioning Strategies | Asking thought-provoking questions based on knowledge transfer and higher-level thinking skills, and allowing time for reflection
Peer Evaluation | Designing student peer evaluation procedures
Readiness Assurance Item Construction | Developing test items based on the assessment of higher order thinking skills
Student Learning Outcomes | Designing student learning outcomes based on the critical thinking skills found in Bloom's Taxonomy: analysis, synthesis, and evaluation
Team Construction | The purposeful construction of teams around a set of pre-determined criteria (e.g., major, grade level, age) established by the instructor; instruments such as the Myers-Briggs Type Indicator may also be utilized to inform team construction
Technology | The utilization of TBL in blended learning and online courses
Validating Course Assessments | Validating course-level assessments prior to utilization

 

*Coaching.  One of the most important aspects of the professional development series will be collegial coaching.  Faculty coaching faculty is a methodology based not only on teacher-leader visibility but, more importantly, on the ability of leaders to communicate effectively with teachers, to focus on matters directly related to curriculum, instruction, and assessment, and to recognize those student behaviors that provide evidence of meaningful long-term learning.  Coaching facilitates effective professional development and helps break down the cycle of teacher isolation. It also serves as a communicative structure to inform teachers of practices that work (Waters, Marzano, & McNulty, 2003).  Professional development followed by support and coaching has a 90% rate of successful implementation (McCray, 2011).  Coaching sessions represent productive conversations between faculty members regarding student learning.  They can also serve as recognition of small wins as new innovations are being implemented (Schmoker, 1999).  Coaching sessions, along with QEP professional development, can also foster collegial teams that promote course content innovation, as discussed in the accepted concept proposal.  Each semester the QEP Director will conduct monthly coaching sessions.  Dr. Michaelsen will also participate in a coaching session held during the first quarter of the academic year.

USA Certificate in Team-Based Learning Pedagogy

A certificate in TBL Pedagogy will be issued through the Senior Vice-President鈥檚 office for participants who meet rigorous professional development standards.  To qualify, participants must attend a minimum of five professional development activities per academic year, participate in monthly coaching sessions, submit no less than three application activities per semester, and submit a professional presentation, manuscript, or grant proposal for consideration to the appropriate organization or agency based on lessons learned from this experience.  The action plan for this project will consist of actions related to logistics, faculty participation and presenter acquisition.
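
The certificate requirements above amount to a conjunction of simple thresholds. A minimal sketch of an eligibility check follows; the record fields and the assumed number of coaching sessions per year are illustrative, not specified by the plan.

```python
from dataclasses import dataclass

@dataclass
class ParticipantRecord:
    pd_sessions_attended: int        # per academic year
    coaching_sessions_attended: int  # monthly sessions attended this year
    application_activities: int      # minimum submitted across semesters
    scholarly_submissions: int       # presentations, manuscripts, or proposals

def certificate_eligible(r, coaching_sessions_held=9):
    # coaching_sessions_held is an assumed count of monthly sessions per year
    return (r.pd_sessions_attended >= 5
            and r.coaching_sessions_attended >= coaching_sessions_held
            and r.application_activities >= 3
            and r.scholarly_submissions >= 1)

print(certificate_eligible(ParticipantRecord(6, 9, 3, 1)))  # True
```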

The TEAM USA Classroom

During the fall of 2012, the university designed and began renovating a classroom into a student-centered, technology-rich center designed to support Team-Based Learning. Scheduled for completion in early spring 2013, the center will be used to model TBL lessons, conduct workshops, hold QEP Advisory Council meetings, provide collegial coaching, and provide a venue for observation of TBL strategies in practice.  Activities in the team classroom will be coordinated by the QEP Director.  The center has been designed around 14 characteristics of a successful active-learning classroom, as indicated by best practices reported by the University of North Carolina at Charlotte (Characteristics of Collaborative Classrooms, 2012).  These include:

  1. Multiple electronic display surfaces oriented on different walls
  2. A good portion of the perimeter walls made up of writing surfaces (whiteboards)
  3. Lightweight, moveable, and reconfigurable furniture
  4. Mobile instructor station
  5. Remote control of audiovisual equipment
  6. Wireless and hardwired network connectivity
  7. Lighting coordinated with the projection screen, with sensors to automatically turn lighting on/off when the room is in use
  8. HVAC/acoustical considerations (quiet, independent room controls, room-to-room sound isolation)
  9. Computer availability
  10. Student work surfaces
  11. Dedicated computer and DVD player
  12. Ceiling-mounted speakers
  13. Teleconferencing to enable collaboration with remotely located groups and guest lecturers
  14. Secure closed equipment niche

QEP Advisory Council Activities

Tables 19 and 20 include a summary of the major activities of the QEP Advisory Council. 

Table 19
QEP Advisory Council 2012-13 Activities

Major Tasks                                       | Oct | Nov | Dec | Jan | Feb | Mar | Apr | May | Jun | Aug
Review of QEP document                            |     |  X  |  X  |  X  |     |     |     |     |     |
Approval of QEP Bylaws                            |     |  X  |     |     |     |     |     |     |     |
Approval of QEP document                          |     |     |     |  X  |     |     |     |     |     |
Marketing of QEP                                  |     |     |  X  |  X  |  X  |  X  |     |     |     |
Report of Pilot Data/Spring project modifications |     |     |     |  X  |     |     |     |     |     |
Submission of QEP report                          |     |     |     |     |  X  |     |     |     |     |
SACS Team Visit                                   |     |     |     |     |     |     |  X  |     |     |
Report of Spring Data/Fall project modifications  |     |     |     |     |     |     |     |     |  X  |
Annual Report                                     |     |     |     |     |     |     |     |     |     |  X

 

Table 20
QEP Advisory Council 2013-18 Activities

Major Tasks                                                             | 13-14 Aug | 13-14 Jan | 14-15 Aug | 14-15 Jan | 15-16 Aug | 15-16 Jan | 16-17 Aug | 16-17 Jan | 17-18 Aug | 17-18 Jan
Appointment of Council Members                                          |     X     |           |     X     |           |     X     |           |     X     |           |     X     |
Report of project data, discussion and implementation of modifications |           |     X     |           |     X     |           |     X     |           |     X     |           |     X
Annual Report                                                           |     X     |           |     X     |           |     X     |           |     X     |           |     X     |

 

 

TIMELINE

The University of South Alabama has established a calendar of QEP actions to be implemented, beginning with the pilot year, 2012-13, and ending with the final year of implementation, 2017-18.  These actions are listed in Tables 21-23.

Table 21
QEP Timeline Details, 2012-14 (Pilot Year 2012-13 and Year 1 2013-14; terms are fall, spring, and summer)

Logistics

  • Objective: To conduct regular meetings with the QEP Advisory Council to solicit input and provide regular updates
    Action: QEP Advisory Council meeting on the 3rd Monday of each month at 8:30 a.m. and as needed
    Person(s) Responsible: QEP Director; Student Assistant
    Terms: fall, spring, and summer of both years

  • Objective: To develop public awareness of and involvement in the QEP
    Actions: Meet with the design professor and students to discuss the QEP; have students design a logo and develop a PR plan; select the logo and implement the marketing plan
    Person(s) Responsible: QEP Director; Design Instructor; QEP Advisory Council; USA Public Relations
    Terms: fall and spring of the pilot year

  • Objective: To secure and distribute classroom testing forms
    Action: Order IF-AT forms
    Person(s) Responsible: QEP Director; Student Assistant
    Terms: fall, spring, and summer of both years

  • Objective: To coordinate the faculty celebration of success
    Actions: Order refreshments; request letters of commendation; reserve the room; notify instructors and administrators as appropriate
    Person(s) Responsible: QEP Director; Student Assistant
    Terms: fall and spring of both years

  • Objective: To develop a Sakai QEP website
    Action: Create a Sakai website able to house application activities and resources developed by instructors to support the implementation of TBL strategies; other features will include announcements, calendar, messages, mail tool, email archive, chat room, wiki, discussion forums, blogs, polls, and a media gallery
    Person(s) Responsible: QEP Director; ILC Director
    Terms: spring of the pilot year

  • Objective: To develop a USA QEP website
    Action: Create a USA QEP website, linked to the Office of Academic Affairs and USA home pages, containing information pertaining to the QEP
    Person(s) Responsible: QEP Director; USA Public Relations
    Terms: spring of the pilot year

  • Objective: To maintain the USA QEP and Sakai websites
    Actions: Regular monitoring, updates, and modifications to the websites as needed; post application activities
    Person(s) Responsible: QEP Director; ILC Director
    Terms: fall, spring, and summer of both years

  • Objective: To create the QEP written report
    Action: Composition of the Quality Enhancement written plan with input from various constituents, including the Advisory Council, the ILC Director, the Institutional Research, Planning and Assessment Associate Vice President, and the Senior Vice President of Academic Affairs
    Person(s) Responsible: QEP Director; Senior Vice President
    Terms: pilot year

  • Objective: To renovate a large classroom into a QEP professional development and demonstration center and a location for Team-Based Learning classes (when needed)
    Action: Gather input from the Advisory Council, electronics specialists, and lab directors
    Person(s) Responsible: QEP Director
    Terms: fall and spring of the pilot year

  • Objective: To secure furniture for the professional development and demonstration center
    Actions: Gather input from the Advisory Council, furniture specialist, and lab directors; order and receive furniture
    Person(s) Responsible: QEP Director
    Terms: fall and spring of the pilot year

Assessment

  • Objective: To utilize assessment results for project modification
    Actions: Design project assessments; administer assessments; collect data at the beginning and end of each semester; review, analyze, and organize data; present data at a QEP Advisory Council meeting; make adjustments and modifications as needed
    Person(s) Responsible: QEP Director; IRPA Director of Assessment; Student Assistant
    Terms: fall, spring, and summer of both years

Faculty Induction

  • Objective: To recruit faculty participants for the QEP
    Actions: Meet with Deans and Chairs to discuss and recommend project participants; meet privately or in small groups with interested instructors to solicit their involvement; invite interested instructors to QEP activities, including professional development and community building
    Person(s) Responsible: QEP Director; QEP Advisory Committee; Senior Vice President
    Terms: fall, spring, and summer of both years

  • Objective: To phase in faculty participants
    Action: Induction of Accounting, Anthropology, Biomedical Sciences, Computer and Information Science, Geography, Speech Pathology, and Statistics faculty into the QEP over the course of the project
    Person(s) Responsible: QEP Director
    Terms: summer of the pilot year and fall, spring, and summer of Year 1

Professional Development

  • Objective: To conduct professional development workshops complete with presenters, materials and supplies, stipends, and certificates
    Actions: Select topics that will support TBL strategies; identify and secure presenters with TBL expertise and experience; award stipends for 9-month workshop participants; allocate funds for materials and supplies; award certificates of participation; award certificates of pedagogy to those who meet requirements
    Person(s) Responsible: QEP Director; ILC Director; Faculty Development Services Director; Advisory Council
    Terms: fall, spring, and summer of both years

  • Objective: To coordinate and facilitate monthly follow-up sessions
    Action: Coaching and follow-up sessions
    Person(s) Responsible: QEP Director
    Terms: fall, spring, and summer of both years

 

Table 22
QEP Timeline Details, 2014-16 (Year 2 2014-15 and Year 3 2015-16; terms are fall, spring, and summer)

Logistics

  • Objective: To conduct regular meetings with the QEP Advisory Council to solicit input and provide regular updates
    Action: QEP Advisory Council meeting on the 3rd Monday of each month at 8:30 a.m. and as needed
    Person(s) Responsible: QEP Director; Student Assistant
    Terms: fall, spring, and summer of both years

  • Objective: To secure and distribute classroom testing forms
    Action: Order IF-AT forms
    Person(s) Responsible: QEP Director; Student Assistant
    Terms: fall, spring, and summer of both years

  • Objective: To coordinate the faculty celebration of success
    Actions: Order refreshments; request letters of commendation; reserve the room; notify instructors and administrators as appropriate
    Person(s) Responsible: QEP Director; Student Assistant
    Terms: fall and spring of both years

  • Objective: To maintain the USA QEP and Sakai websites
    Actions: Regular monitoring, updates, and modifications to the websites as needed; post application activities
    Person(s) Responsible: QEP Director; ILC Director
    Terms: fall, spring, and summer of both years

Assessment

  • Objective: To utilize assessment results for project modification
    Actions: Design project assessments; administer assessments; collect data at the beginning and end of each semester; review, analyze, and organize data; present data at a QEP Advisory Council meeting; make adjustments and modifications as needed
    Person(s) Responsible: QEP Director; IRPA Director of Assessment; Student Assistant
    Terms: fall, spring, and summer of both years

Faculty Induction

  • Objective: To recruit faculty participants for the QEP
    Actions: Meet with Deans and Chairs to discuss and recommend project participants; meet privately or in small groups with interested instructors to solicit their involvement; invite interested instructors to QEP activities, including professional development and community building
    Person(s) Responsible: QEP Director; QEP Advisory Committee; Senior Vice President
    Terms: fall, spring, and summer of both years

  • Objective: To phase in faculty participants
    Action: Induction of Biology, Electrical and Computer Engineering, Mechanical Engineering, Chemical and Biomolecular Engineering, and Geology faculty into the QEP
    Person(s) Responsible: QEP Director
    Terms: fall, spring, and summer of Year 2

  • Objective: To phase in faculty participants
    Action: Induction of Cell Biology and Neuroscience, Chemistry, Earth Science, and Nursing faculty into the QEP
    Person(s) Responsible: QEP Director
    Terms: fall, spring, and summer of Year 3

Professional Development

  • Objective: To conduct professional development workshops complete with presenters, materials and supplies, stipends, and certificates
    Actions: Select topics that will support TBL strategies; identify and secure presenters with TBL expertise and experience; award stipends for 9-month workshop participants; allocate funds for materials and supplies; award certificates of participation; award certificates of pedagogy to those who meet requirements
    Person(s) Responsible: QEP Director; ILC Director; Faculty Development Services Director; Advisory Council
    Terms: fall, spring, and summer of both years

  • Objective: To coordinate and facilitate monthly follow-up sessions
    Action: Coaching and follow-up sessions
    Person(s) Responsible: QEP Director
    Terms: fall, spring, and summer of both years


Table 23
QEP Timeline Details, 2016-18 (Year 4 2016-17 and Year 5 2017-18; terms are fall, spring, and summer)

Logistics

  • Objective: To conduct regular meetings with the QEP Advisory Council to solicit input and provide regular updates
    Action: QEP Advisory Council meeting on the 3rd Monday of each month at 8:30 a.m. and as needed
    Person(s) Responsible: QEP Director; Student Assistant
    Terms: fall, spring, and summer of both years

  • Objective: To secure and distribute classroom testing forms
    Action: Order IF-AT forms
    Person(s) Responsible: QEP Director; Student Assistant
    Terms: fall, spring, and summer of both years

  • Objective: To coordinate the faculty celebration of success
    Actions: Order refreshments; request letters of commendation; reserve the room; notify instructors and administrators as appropriate
    Person(s) Responsible: QEP Director; QEP Assistant Director
    Terms: fall and spring of both years

  • Objective: To maintain the USA QEP and Sakai websites
    Actions: Regular monitoring, updates, and modifications to the websites as needed; post application activities
    Person(s) Responsible: QEP Director; ILC Director
    Terms: fall, spring, and summer of both years

Assessment

  • Objective: To utilize assessment results for project modification
    Actions: Design project assessments; administer assessments; collect data at the beginning and end of each semester; review, analyze, and organize data; present data at a QEP Advisory Council meeting; make adjustments and modifications as needed
    Person(s) Responsible: QEP Director; IRPA Director of Assessment; Student Assistant
    Terms: fall, spring, and summer of both years

  • Objective: To develop a QEP summary report
    Actions: Collect and organize assessment data; analyze and report all QEP assessment data
    Person(s) Responsible: QEP Director; IRPA Director of Assessment; Student Assistant
    Terms: spring and summer of Year 5

Faculty Induction

  • Objective: To recruit faculty participants for the QEP
    Actions: Meet with Deans and Chairs to discuss and recommend project participants; meet privately or in small groups with interested instructors to solicit their involvement; invite interested instructors to QEP activities, including professional development and community building
    Person(s) Responsible: QEP Director; QEP Advisory Committee; Senior Vice President
    Terms: fall, spring, and summer of Year 4

  • Objective: To phase in faculty participants
    Action: Induction of Pharmacology, Physician Assistant, Physics, and Speech Pathology faculty into the QEP
    Person(s) Responsible: QEP Director
    Terms: spring and summer of Year 4

  • Objective: To phase in faculty participants
    Action: Induction of Mathematics, Mathematics Education, and Science Education faculty into the QEP
    Person(s) Responsible: QEP Director
    Terms: fall and spring of Year 5

Professional Development

  • Objective: To conduct professional development workshops complete with presenters, materials and supplies, stipends, and certificates
    Actions: Select topics that will support TBL strategies; identify and secure presenters with TBL expertise and experience; award stipends for 9-month workshop participants; allocate funds for materials and supplies; award certificates of participation; award certificates of pedagogy to those who meet requirements
    Person(s) Responsible: QEP Director; ILC Director; Faculty Development Services Director; Advisory Council
    Terms: fall, spring, and summer of both years

  • Objective: To coordinate and facilitate monthly follow-up sessions
    Action: Coaching and follow-up sessions
    Person(s) Responsible: QEP Director
    Terms: fall, spring, and summer of both years

Timelines were disaggregated into categories consisting of objectives, actions, person(s) responsible, and year.  These categories contain brief summaries of the actions to be implemented during the 2012-2018 QEP process.

 

ORGANIZATIONAL STRUCTURE

The purpose of this section is to delineate and clarify the roles and responsibilities of those charged with planning, organizing, and implementing the project: the QEP Director, the QEP Advisory Council, the Innovation in Learning Center Director, the Institutional Research, Planning and Assessment Director of Assessment, the Student Assistant, the Faculty Development Services Director, and the Senior Vice President of Academic Affairs.  A flow chart is presented to document a clear chain of command between the QEP Director and upper university administration.

Personnel Roles and Responsibility

The Senior Vice President for Academic Affairs:  The Senior Vice President for Academic Affairs is the Chief Academic Officer for the institution.  He will oversee the QEP process, supervise the QEP Director, and provide support and guidance where appropriate.

The QEP Director:  The QEP Director will administer and coordinate the Quality Enhancement Plan.  The QEP Director will be responsible for:

  • Administration of the QEP budget
  • Approval of assessment measures to be embedded in each course
  • Celebration of Success
  • Coordination and monitoring of assessment
  • Collection of data for each course at the end of the semester
  • Compiling the 5-year QEP SACS report
  • Coordination of QEP coaching and professional development
  • Coordination of the TEAM USA classroom
  • Faculty recruitment
  • Inclusion of project courses
  • Maintenance of QEP and Sakai websites
  • Administration of Assessments
  • Organization and analysis of data
  • Service on the Institutional Effectiveness Committee

 

Administrative Assistant:  The Administrative Assistant will perform general administrative duties and office support for the Director.  Responsibilities will include acting as receptionist, scheduling, maintaining the calendar, and preparing correspondence and other documents as needed.

Student Assistant:  The Student Assistant will enter assessment data, assist with the distribution of materials and supplies (e.g., IF-AT forms) to QEP instructors, and assist with any other duties required to deliver the QEP as assigned by the QEP Director.

The Institutional Research, Planning and Assessment Director of Assessment:  The Institutional Research, Planning and Assessment (IRPA) Director of Assessment will serve as a resource for developing assessments and for collecting and analyzing data generated by project assessments.

The Innovation in Learning Center Director:  The Innovation in Learning Center Director will assist in the creation and maintenance of project websites and serve as a resource for planning, staging, and analyzing the effectiveness of professional development activities.

The Faculty Development Services Director:  The Faculty Development Services Director will help secure presenters, plan and stage sessions, and serve as a resource for analyzing the effectiveness of professional development activities.

The QEP Advisory Council:  The QEP Advisory Council will guide the QEP process and provide recommendations relative to the delivery of the Plan.  The Council will also provide assistance with community events and celebrations.

Chain of Command

The Innovation in Learning Center Director and the Institutional Research, Planning and Assessment Director shall have a lateral relationship with the QEP Director.  The Faculty Development Services Director will work under the supervision of the Innovation in Learning Center Director.  The Administrative Assistant and Student Assistant will work under the supervision of the QEP Director.  The QEP Advisory Council shall serve in an advisory capacity (Figure 2).

Figure 2.  QEP Organizational Flow Chart. 

The QEP Director has been allocated adequate human resources, including administrative support and a student assistant.  The QEP Director has also been allocated effort of key personnel, including the Innovation in Learning Center Director, the Faculty Development Services Director, and the IRPA Director of Assessment.  Additionally, the QEP Director reports directly to the Senior Vice President of Academic Affairs, ensuring an open line of communication with upper university administration.

 

ASSESSMENT AND EVALUATION

Hypothesis

The integration of critical thinking skills and collaboration into STEM courses will lead to improved student-learning outcomes.

Phases of the Assessment and Evaluation Process

The purpose of TEAM USA is to improve student learning in STEM courses by increasing student critical thinking and collaborative skills through the utilization of Team-Based Learning instructional strategies.  In order to adequately evaluate the effectiveness of the QEP, we need to engage in a multilevel assessment plan with both formative and summative components (Table 26).

Table 26
Assessment Overview

Hypothesis: The integration of higher order thinking skills and collaboration into STEM courses will lead to improved student learning outcomes.

Objective: Improved student learning outcomes

Student Outcome: Student Learning in Content Areas
  • Formative Assessment*: iRATs and tRATs [D]
  • Summative Assessment*: Final Exam or Capstone Course Activity [D]

Student Outcome: Critical Thinking Skills
  • Formative Assessment*: ETS Pre-Test [D]
  • Summative Assessment*: NSSE [I] and ETS Post-Test [D]

Student Outcome: Collaboration
  • Formative Assessment*: Peer Reviews [D]
  • Summative Assessment*: NSSE [I] and/or Faculty Panel Review [D]

Across all outcomes
  • Student Satisfaction Surveys [I]
  • Faculty Satisfaction Surveys [I]

*Comparison groups or targets
[D] Direct assessment
[I] Indirect assessment

The diagram below (Figure 3) serves as an outline of the basic logic model for the project.  The ultimate outcomes are improved retention of students in STEM majors and improved instructional quality in STEM programs.  The student learning outcomes that contribute to these include greater learning in STEM classes, improved critical thinking skills, and improved ability of students to collaborate.  The main intervention to reach that goal is the use of TBL.  The success of TBL, in turn, is a function of the resources put into providing faculty with professional development, the success of that professional development, and faculty implementation of TBL with fidelity.  All of these elements have to be part of the assessment plan for the project to work.  Hence, each will be discussed as part of the formative and summative components of the evaluation plan.


Figure 3.  Outcomes, Intervention, Inputs

While the main portion of this section focuses on the outcomes, it is important to assess the context of the intervention including the resources associated with the project.   Additionally, mechanisms for assessing the fidelity of implementation need to be put into place once the pilot part of the project is complete and the university moves into its full implementation phase. 

Figure 4 represents an overview of the process that guides the assessment of the project.  The project is guided by an action-research orientation that focuses on attending to data, making plans based on data, implementing those plans, evaluating the outcomes, making revisions, and then re-engaging in the plan, act, and evaluate cycle.  The project is currently in the phase of implementing the TBL pilot.  The pilot part of the assessment plan mirrors the assessment plan that will be carried out over the course of the project, with adjustments made based on the results of the pilot.


Figure 4.  Design of the Evaluation of the Pilot Phase of the Project

In fall of 2012, a pilot group of six courses with instructors who volunteered to implement TBL was undertaken.  The courses included in this pilot were Beginning Programming, Information Systems/Information Technology, Project Management, Issues in Biomedical Science, Landscaping Patterns and Processes, and Physical Anthropology.  One purpose of the pilot was to examine the implementation of TBL to provide formative feedback into the process of faculty training and development.  A second purpose was to test the various assessments that the project plans to use.  Table 27 below lists the courses and the number of students enrolled in them, and indicates whether there is an easily assessed comparison group that can be used to gauge the impact of TBL.

Table 27
Comparison Groups

  • Anthropology: AN 210 Physical Anthropology (N = 64).  Comparison group: Yes; Spring; different instructor.
  • Civil Engineering: CE 384 Structural Analysis (N = 22).  Comparison group: No.
  • Geography: GEO 105 Landscape Processes (Physical Geography) (N = 92).  Comparison group: Yes; Spring semester; different instructors.
  • Computer Science/Information Systems: ITE 475 Information Systems/Information Technology, Project Management (N = 14).  Comparison group: Yes; Spring; same instructor.
  • Computer Information Systems: CIS 115 Beginning Programming (N = 19).  Comparison group: Yes; both semesters.
  • Biomedical Sciences: BMD 493 Issues in Biomedical Science (N = 19).  Comparison group: Yes; Fall.

 

Classroom Level Analysis of TBL Effectiveness

The project will not set up matched groups or randomly assign students or instructors to use or not use TBL.  Hence, non-equivalent comparison groups, pretest-posttest comparisons, or indications of student performance meeting standards will be used to assess the effectiveness of TBL in classes.  The exact comparison, difference, or standard to be met will depend upon the class.  Non-equivalent comparison groups will be comparison classes of the same course that occur before, concurrent with, or after the courses that use TBL.  These groups are non-equivalent because they vary on several dimensions other than the presence of TBL in the classroom.  For example, a section taught by a professor who volunteered to use TBL may differ from one taught by another professor simply because the TBL instructor volunteered: individuals who are open to innovation and instructional improvement may already be better instructors than those who are not open to change and do not volunteer.  This non-equivalence limits the inferences we can make about the effectiveness of TBL from any one class, but when the pattern of performance across these comparisons is consistent with the other approaches we use (e.g., pretest-to-posttest changes, meeting performance standards), we can infer positive support for TBL.  Further, we will collect qualitative data from surveys and will observe students and professors to provide additional evidence for evaluating TBL.  We will use a "Success Case" logic (Brinkerhoff, 2003), examining courses with exceptional outcomes for evidence that TBL or some other factor (e.g., the instructor) accounted for those outcomes.
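To make the pretest-posttest option concrete, a minimal Python sketch follows; the scores are hypothetical placeholders rather than project data, and the actual analyses would run on course assessment records:

    # Hypothetical sketch of a within-class pretest-posttest comparison.
    # All scores are illustrative placeholders, not project data.
    from scipy import stats

    pre = [58, 64, 71, 49, 66, 73, 55, 61, 68, 52]   # unit pretest scores
    post = [67, 70, 78, 60, 72, 80, 63, 66, 75, 61]  # matched posttest scores

    # Paired t-test on per-student gains; a reliable positive gain is one
    # strand of evidence, weighed alongside the non-equivalent group
    # comparisons described above.
    t, p = stats.ttest_rel(post, pre)
    mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
    print(f"Mean gain = {mean_gain:.1f} points, t = {t:.2f}, p = {p:.3f}")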

Project-wide Analysis of TBL Effectiveness

Along with examining the impact within the classroom, project-wide comparisons of critical thinking, engagement, and skill in collaboration (the three emphases of TBL) will be undertaken.  Students from the TBL pilot classes will be compared to randomly selected groups of students on the ETS Proficiency Profile, course engagement, and their self-reported (and/or peer reported) ratings of their ability to work in teams. The size and statistical significance of these comparisons should provide additional evidence that TBL is effective. When possible, we will attempt to control these comparisons for preexisting differences by using ACT scores, GPA, or other measures. 
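As a hedged sketch of what one covariate-adjusted, project-wide comparison might look like (the column names ets_score, tbl, act, and gpa, and all values, are illustrative assumptions):

    # ANCOVA-style comparison of ETS Proficiency Profile scores, adjusting
    # for ACT and GPA as covariates. Records are fabricated examples.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "ets_score": [440, 455, 470, 430, 460, 452, 445, 468, 436, 458],
        "tbl":       [1, 1, 1, 1, 1, 0, 0, 0, 0, 0],  # 1 = TBL pilot student
        "act":       [21, 24, 26, 20, 25, 23, 22, 27, 19, 24],
        "gpa":       [2.8, 3.2, 3.6, 2.5, 3.4, 3.1, 2.9, 3.7, 2.4, 3.3],
    })

    # The coefficient on tbl estimates the group difference after
    # controlling for the preexisting-ability covariates.
    model = smf.ols("ets_score ~ tbl + act + gpa", data=df).fit()
    print(model.summary())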

Another project-wide aspect of the QEP concerns course withdrawal, student satisfaction based on course evaluations, and student grades.  Withdrawals and grade distributions will be compared to rates over the last three years of offering the course.  Student satisfaction and other aspects of student course evaluation will be collected through the "Class Climate" (Scantron) system.  Student responses to the standard set of questions, along with a subset of questions that ask about traits that are part of successful use of TBL (e.g., engagement, collaboration, higher order learning), will be included.

Fidelity of Implementation

Along with analyzing outcomes, the project will examine the fidelity of instructors to the TBL approach, using as a guide the framework developed by Century, Rudnick, and Freeman (2010).  The framework distinguishes structurally critical components (Does the instructor know how to organize a TBL lesson, and is he or she aware of what belongs in it?) from instructionally critical components (Does the instructor use the pedagogical strategies that are part of TBL in class? Are the students engaged?).  The TBL scorecard will be used along with a self-report questionnaire for professors and students and a protocol for observing professors, drawing on the TBL materials and consultants.

Measures Used in Comparisons

Both in the pilot phase and in the larger-scale initiative, procedures will be utilized to ensure that the measures of learning outcomes are valid and reliable indicators.  When possible, standardized measures with established validity and reliability will be used for project-wide assessments.  A process will also be put into place for examining the reliability and validity of measures.

Course Outcomes

A project assessment committee will work with instructors no later than the semester prior to implementation of the project to validate their classroom assessments, including the final exam.  At a minimum, claims that test questions, assignments, or other assessments measure higher order thinking skills will require agreement by the assessment committee.  Content validity indices, such as those described by Shultz and Whitney (2005), will be computed.  As is course appropriate, assessments will include case studies, essays, portfolios, scenarios, and tests.  Inter-rater reliability checks will be conducted on assessments that are subjective in nature.
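For the inter-rater reliability checks, one simple approach is a weighted kappa between two raters' rubric scores; a minimal sketch with hypothetical ratings:

    # Inter-rater reliability sketch for a subjectively scored assessment
    # (e.g., an essay rubric). Ratings below are hypothetical.
    from sklearn.metrics import cohen_kappa_score

    rater_a = [3, 4, 2, 4, 3, 1, 4, 2, 3, 4]  # rubric levels from rater A
    rater_b = [3, 4, 2, 3, 3, 1, 4, 2, 4, 4]  # rubric levels from rater B

    # Quadratic weighting credits near-agreement on an ordinal scale.
    kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
    print(f"Quadratically weighted kappa = {kappa:.2f}")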

Critical Thinking

The project will include a process to ensure that the assessment of critical thinking is both reliable and valid.  First, part of the training in TBL will involve developing critical thinking assessments.  In addition, a QEP critical thinking assessment committee made up of faculty trained in Bloom's Taxonomy will review assessments from classes to ensure that faculty learning outcomes do measure critical thinking.  A process similar to that developed by Rovinelli and Hambleton (1977) to measure item congruence with objectives will be used to create an index of congruence with the higher levels of Bloom's taxonomy.  That index will be used to determine whether test items or other assessments match critical thinking levels of Bloom's taxonomy.
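The index can be computed directly from judges' ratings.  In the sketch below, each judge rates an item +1 (clearly measures the objective), 0 (unclear), or -1 (clearly does not) against every objective; the function and ratings are illustrative, assuming the standard form of the index, N/(2N-2) times the difference between the item's mean rating on the targeted objective and its mean rating across all N objectives:

    # Sketch of the Rovinelli and Hambleton (1977) index of item-objective
    # congruence. ratings: one list per judge, one rating per objective;
    # k: index of the objective the item was written to target.
    def item_objective_congruence(ratings, k):
        n_obj = len(ratings[0])
        n_judges = len(ratings)
        mean_k = sum(judge[k] for judge in ratings) / n_judges
        grand_mean = sum(sum(judge) for judge in ratings) / (n_judges * n_obj)
        # The index approaches +1 when judges agree the item fits objective
        # k and no other objective.
        return (n_obj / (2 * n_obj - 2)) * (mean_k - grand_mean)

    # Three hypothetical judges rate one test item against four objectives;
    # the item targets objective index 2 (a higher level of Bloom's taxonomy).
    judge_ratings = [[-1, 0, 1, -1],
                     [-1, -1, 1, 0],
                     [0, -1, 1, -1]]
    print(f"Congruence index = {item_objective_congruence(judge_ratings, 2):.2f}")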

The project will also use a standardized test, the ETS Proficiency Profile, to measure critical thinking on a project-wide basis.  The ETS test has several advantages.  First, it is already used as part of USA's involvement in the Voluntary System of Accountability (VSA).  Hence, we can track changes for the institution and also use the group that took the ETS Proficiency Profile as a comparison group for students who have participated in TBL.  Further, we can track institutional scores in comparison to other institutions that are part of the VSA.  Second, its short form can be given in a reasonable amount of time without requiring faculty time for grading.  Thus, logistically it works well.

Collaboration

Aside from direct observation, which is too costly to conduct on a large scale, the only way to assess collaborative skills among students is to use peer and faculty judgment.  TBL requires students to work together for an entire semester and make regular decisions as a group that count toward their grades.  As a result, peers will have a great deal of evidence to use in making reliable and valid judgments about their group members.  Likewise, faculty members have multiple opportunities to observe how well groups work together.  They, too, have empirical observations that they can use to help determine the effectiveness of students as collaborators in groups.  Students will use the USA Peer Evaluation Rubric.  Given that multiple individuals will evaluate each group member, the reliability of the instrument can be tracked.  Further, the degree to which peer and faculty evaluations coincide can also be used to determine the consistency of collaboration ratings.
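One way the peer-faculty consistency might be tracked is to correlate each student's mean peer rating with the instructor's rating of the same student; all rubric scores below are hypothetical placeholders for USA Peer Evaluation Rubric data:

    # Sketch of a peer-faculty agreement check on collaboration ratings.
    from statistics import mean
    from scipy.stats import pearsonr

    # Each inner list holds one student's ratings from his or her teammates
    # (1-5 rubric scale); faculty_rating holds the instructor's rating of
    # the same five students.
    peer_ratings = [[5, 4, 5], [3, 3, 4], [4, 5, 4], [2, 3, 2], [5, 5, 4]]
    faculty_rating = [5, 3, 4, 2, 5]

    peer_means = [mean(r) for r in peer_ratings]
    r, p = pearsonr(peer_means, faculty_rating)
    print(f"Peer-faculty agreement: r = {r:.2f} (p = {p:.3f})")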

Satisfaction and Beliefs

There are a variety of measures surrounding student and faculty beliefs and satisfaction with TBL.  These measures provide multiple indicators that can help us triangulate these data to find consistent patterns.  There are course questionnaires, TBL questionnaires, and standard course evaluations.  Consistency in both the quantitative and qualitative data surrounding these issues will provide reliability evidence to support the continued use of these instruments.

Withdrawals and Course Grades

Both withdrawals and course grades are available through the USA Banner system.  Course grades and withdrawals will be examined across a three-year window to see if there are changes.  Based on chapters from Michaelsen et al. (2002), the evidence suggests that withdrawal rates should decrease and grades should improve.  The patterns of change in withdrawal rates and grades will be tracked to see if they are similar to other TBL implementations.
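A minimal sketch of one such withdrawal-rate comparison follows; the counts are hypothetical stand-ins for what would actually be pulled from Banner:

    # Chi-square comparison of withdrawal rates: the TBL offering of a
    # course versus the pooled offerings from the prior three years.
    from scipy.stats import chi2_contingency

    #        completed  withdrew
    table = [[55, 5],    # TBL semester
             [150, 30]]  # prior three years, pooled

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"Withdrawal rate: TBL = {5/60:.1%}, prior = {30/180:.1%}, "
          f"chi-square = {chi2:.2f}, p = {p:.3f}")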

Data Analysis

As noted above, the project will involve both course and project-wide assessments.  In some cases, there will be comparison groups, whereas for other analyses, the primary focus will be descriptive (Were the students satisfied? Did the requisite number reach mastery on a course assessment?).  Course-level comparisons will largely be descriptive: numbers of students reaching mastery, improvement from pretests to posttests, or mean differences will be examined.  For project-wide outcomes, where sampling is more appropriate, group comparisons will be carried out using parametric or non-parametric statistics as appropriate (e.g., independent t-tests or ANCOVA for mean differences, and chi-square tests or logistic regression for differences in proportions).  If possible, comparisons between groups will control for student differences by adjusting for GPA, class level (freshman, sophomore, junior, senior), and ACT score.  At the very least, even if these covariates cannot be used, whether those enrolled in the TBL courses differ from those in non-TBL sections may be analyzed.  The numbers in the pilot may preclude some inferential analyses at the course level, but the N of 230 students in TBL makes some inferential statistical analysis viable.  Below are sets of goals for the pilot, indicators of success, and approaches to analyzing the data.
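As a sketch of the proportion-based comparisons named above, a logistic regression of mastery on TBL participation with GPA and ACT as covariates might look like the following; column names and records are illustrative, and with the project's larger N, class level could be added as a categorical factor:

    # Logistic regression sketch: odds of reaching mastery for TBL versus
    # non-TBL students, adjusted for GPA and ACT. Data are fabricated.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "mastery": [1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0],
        "tbl":     [1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0],
        "gpa":     [3.1, 3.5, 3.4, 2.6, 2.4, 3.8, 3.2, 2.9, 2.2, 3.0, 3.3, 2.5],
        "act":     [23, 26, 25, 20, 19, 28, 24, 21, 18, 22, 26, 20],
    })

    # exp(coefficient on tbl) gives the adjusted odds ratio for mastery.
    model = smf.logit("mastery ~ tbl + gpa + act", data=df).fit(disp=0)
    print(model.summary())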

Strategic Goal #1: To improve student-learning outcomes in STEM courses

Indicator: Students enrolled in STEM courses selected for this project will achieve mastery levels of student learning outcomes.

Research Question: Are there higher mastery levels of student learning outcomes in STEM courses where TBL instructional strategies are used?

Table 28
Pilot Strategic Goal #1 Data Analysis

Direct Measure: Course assessments
  • Data: Assessment data regarding mastery of student learning outcomes at targeted levels of achievement, and/or student scores from TBL sections and from non-TBL sections
  • Statistical Procedure: Descriptive, and/or non-equivalent group comparison
  • Analysis Procedure: Summarization of test scores relative to target scores, and/or comparison of test scores of students from TBL courses with test scores of students from identical non-TBL course sections offered during the same semester or during a previous semester

Data Collection Procedures: Student learning outcomes shall be assessed during the semester.  Data will be collected at the end of each semester.

Indirect Measure*: Student and faculty satisfaction surveys administered in pilot courses (end of semester)
  • Data: Perceived effectiveness of TBL strategies relative to student learning outcomes
  • Statistical Procedure: Descriptive
  • Analysis Procedure: Calculation of mean scores

 

Strategic Goal #2: To improve student learning outcomes associated with critical thinking skills in STEM disciplines

Indicator:  Students enrolled in STEM courses will demonstrate analysis, synthesis and evaluation.

Research Question: Are there higher levels of critical thinking skills among students enrolled in STEM courses where TBL instructional strategies are used?

Table 29
Pilot Strategic Goal #2 Data Analysis

Direct Measure: Critical thinking assessment (ETS Proficiency Profile, Short Form)
  • Data: ETS Proficiency Profile scores of students from TBL and non-TBL sections
  • Statistical Procedure: Non-equivalent group comparison
  • Analysis Procedure: Comparison of ETS Proficiency Profile scores of students from TBL and non-TBL sections

Data Collection Procedures: The ETS Proficiency Profile shall be administered at the end of each semester to a random sample of students enrolled in courses where TBL is utilized and a random sample of students enrolled in identical courses where TBL is not utilized.

Indirect Measure*: Student and faculty satisfaction surveys administered in pilot courses (end of semester)
  • Data: Perceived effectiveness of TBL strategies relative to critical thinking
  • Statistical Procedure: Descriptive
  • Analysis Procedure: Calculation of mean scores

 

Strategic Goal #3: To improve the collaborative skills of students enrolled in STEM courses.

Indicator: Students enrolled in STEM courses selected for this project will work in teams to make decisions and solve problems.

Research Question: Are there higher levels of collaborative skills among students enrolled in STEM courses where TBL instructional strategies are used?

Table 30
Pilot Strategic Goal #3 Data Analysis

Direct Measure: Peer review
  • Data: Levels of collaboration within teams
  • Statistical Procedure: Descriptive
  • Analysis Procedure: Review of peer-review responses; coding and sorting responses into categories using a nominal or ordinal method; recording the relative frequency of each response category; and examining categories for themes

Data Collection Procedures: Student peer reviews will be collected at the end of each semester by course instructors.  Responses will be compiled and scored with a common collaboration rubric by a faculty panel.

Indirect Measure*: Student and faculty satisfaction surveys administered in pilot courses (end of semester)
  • Data: Effectiveness of Team-Based Learning strategies relative to collaboration
  • Statistical Procedure: Descriptive
  • Analysis Procedure: Calculation of mean scores

 

Strategic Goal #4: To improve student engagement in STEM courses.

Indicator: Students enrolled in STEM courses selected for this project will be actively engaged in instructional episodes during class.

Research Question: Are there differences in student engagement among students enrolled in STEM courses where TBL instructional strategies are used?

Table 31
Pilot Strategic Goal #4 Data Analysis

Direct Measure: New Jersey Medical School Survey (questions 1-8 on the student satisfaction survey)
  • Data: Levels of student engagement during class
  • Statistical Procedure: Descriptive
  • Analysis Procedure: Calculation of mean scores

Data Collection Procedures: Student satisfaction surveys, including questions from the New Jersey Medical School Survey instrument, will be administered and collected at the end of each semester.

Indirect Measure*: Student and faculty satisfaction surveys (end of semester)
  • Data: Effectiveness of TBL strategies relative to engagement
  • Statistical Procedure: Descriptive
  • Analysis Procedure: Calculation of mean scores

 

Strategic Goal #5: To improve the retention rate of students enrolled in STEM courses.

Indicator: Students enrolled in STEM courses will show higher levels of persistence than students who are not.

Research Question: Are there lower withdrawal rates among students enrolled in STEM courses where TBL instructional strategies are used?

Table 32
Pilot Strategic Goal #5 Data Analysis

Direct Measure: Student withdrawals
  • Data: Student withdrawal counts from TBL courses and from comparable non-TBL courses
  • Statistical Procedure: Non-equivalent group comparison
  • Analysis Procedure: Comparison of withdrawal rates between students enrolled in TBL courses and those who are not

Data Collection Procedures: Student data will be compiled using the university data management system (Banner).

 

Full Implementation Phase Assessment Plan

After the pilot is completed, adjustments will be made in the assessment plan based on: a) how students and faculty responded to the questionnaires and surveys; and b) any evidence to indicate that there are issues related to the reliability and validity of the assessment tools used.  Further, if the preponderance of evidence indicates that professors have not been successful in implementing the TBL model, the plan for training and professional development will be revised.  Because this project involves an action research plan, each iteration that brings in new participants may lead to further revisions of both the TBL intervention and the assessment plan. 

Enrollment and Persistence in Using TBL

One part of the full implementation plan that needs to be tracked is the degree to which individuals who participate continue to participate over time after they have started using TBL.  Additionally, since individuals come into the QEP project voluntarily, it is important to track who decides to get involved and for what reason.  For example, do individuals who join the project to get external "perks" (release time, technology, etc.) continue to use TBL or some variation of it after they have received those perks?  What are the characteristics of faculty who continue to use TBL?  How much time and effort is required for professors to move their courses into a TBL format?  Are there pedagogically effective variations on the basic TBL model that faculty develop?  Do faculty who use TBL in one course decide to use it in additional courses?  If so, is the amount of time and effort significantly reduced?

Another issue with enrollment, and perhaps with effective usage, concerns the degree to which faculty need to create their own course resources versus using resources from the TBL site.  As the QEP is marketed to more faculty, it is important to determine what perceived barriers to enrollment and persistence in using TBL exist.

Finally, one other issue related to enrollment that needs to be considered is how to motivate professors who teach courses with high withdrawal rates and significant numbers of Ds and Fs to try TBL.  Data pertaining to this issue will be collected through faculty surveys.

Design of the Full Assessment Plan

The pilot assessment plan will be modified as necessary, but the basic design will be replicated throughout the project.  As faculty members are enrolled, strategies for creating comparison groups for the chosen courses will be undertaken.  If comparison groups are not available, pretest-posttest comparisons will be made.  If pretest-posttest comparisons are not possible or do not make sense, then mastery levels for course learning outcomes will be generated.  This approach will allow evaluation of TBL at the course level.

On the project level, comparison groups of students drawn at random each year from among students with no TBL experience will be utilized.  The same approach used in the pilot stage will then be replicated, comparing TBL to non-TBL students on critical thinking, engagement, and course satisfaction.  Data will accumulate over the years of the project, and cumulative analyses mirroring the yearly project comparisons will be conducted.

As the number of faculty with multiple iterations of TBL increases, longitudinal analyses will be conducted to determine whether there are continued improvements in courses.  It may take two or three iterations of using TBL to realize its potential.  As a result, student outcomes will be analyzed carefully across more than one semester of TBL use.  Also, depending upon the information gained from some of the measures, some of the qualitative assessments may be eliminated in later years of the project (years 4-5).

Finally, part of the evaluation plan will be to examine changes on the NSSE and the Senior Survey.  These regular surveys should show some change for our students once sufficient numbers of students have participated in TBL classes.  Items on the NSSE related to problem solving, group work, and other areas related to TBL should show changes given a large enough coverage of courses and majors by the initiative.

Did TBL Accomplish its Goal?

In year 5 of the project, the long-term goal of increasing persistence in STEM fields will be examined.  Based on analysis of the majors that have used TBL, conclusions will be drawn regarding whether TBL students were more likely to persist in the field than students who were not involved in TBL.  These analyses will be contingent upon the courses that use TBL, the depth of infusion of TBL into the curriculum for a major, and the effectiveness of TBL in those courses.  The retention and graduation rates of students whose major field had high levels of involvement with the TBL initiative will be compared to those of STEM majors who were not involved in TBL.

 

REFERENCES

Annual Report on Undergraduate Persistence (2011).  University of South Alabama Institutional Research, Planning and Assessment.  Retrieved from www.southalabama.edu/departments/institutionalresearch.

Baepler, P., Cotner, S., & Kellerman, A. (2008). Scratch this! The IF-AT as a technique for stimulating discussion and exposing misconceptions. Journal of College Science Teaching, 37(4), 48.

Banfield, V., Fagan, B., & Janes, C. (2012). Charting a new course in knowledge: Creating life-long critical care thinkers. Dynamics, 23(1), 24-28.

Bandura, A. (1977). Social learning theory. New York: General Learning Press.

Bloom, B. S. (1956). Taxonomy of educational objectives, handbook I: The cognitive domain. New York, NY: David McKay Co Inc.

Bransford, J. D., Brown, A. L., & Cocking, R. R. (1999).  How people learn: Brain, mind, experience, and school.  Washington, DC: National Academy Press.

Bruner, J. (1986). Actual minds, possible worlds. Cambridge, MA: Harvard University Press.

Carmichael, J. (2009). Team-Based Learning enhances performance in introductory biology. Journal of College Science Teaching, 38(4), 54-61.

Characteristics of Collaborative Classrooms (2012).  University of North Carolina at Charlotte Collaborative Learning Spaces.  

Clark, M., Nguyen, H., Bray, C., & Levine, R. (2008). Team-Based Learning in an undergraduate nursing course. Journal of Nursing Education, 47(3), 111-117.

Cohen, S., & Bailey, D. (1997). What makes teams work: Group effectiveness research from the shop floor to the executive suite. Journal of Management, 23(3), 239-290.

Costa, A., & Kallick, B. (2000). Habits of mind: A developmental series. Alexandria, VA: Association for Supervision and Curriculum Development.

Darling-Hammond, L. (1996). The right to learn and the advancement of teaching: Research, policy, and practice for democratic education. Educational Researcher, 25(6), 5-17.

Deutsch, M. (1949). A theory of cooperation and competition. Human Relations, 2, 129-152.

Dinan, F. J., & Frydrychowski, V. A. (1995). A team learning method for organic chemistry. Journal of Chemical Education, 75(5), 429-431.

Drummond, C. K. (2012). Team-Based Learning to enhance critical thinking skills in entrepreneurship education. Journal of Entrepreneurship Education, 15, 57-63.

DuFour, R., & Eaker, R. (1998). Professional learning communities at work: Best practices for enhancing student achievement. Bloomington, IN: National Educational Service.

Dunaway, G. A. (2005).  Adaptation of team learning to an introductory graduate pharmacology course.  Teaching and Learning in Medicine, 17(1), 56-62.

Duke TIP (2012).  Duke University Talent Identification Program. Retrieved from http://www.dukegiftedletter.com/articles/vol6no4_feature.html.

Gomez, E., Wu, D., & Passerini, K. (2010). Computer-supported Team-Based Learning: The impact of motivation, enjoyment and team contributions on learning outcomes. Computers & Education, 55(1), 378-390.

Haidet, P., Morgan, R. O., O'Malley, K., Moran, B. J., & Richards, B. F. (2005). A controlled trial of active versus passive learning strategies in a large group setting. Advances in Health Sciences Education: Theory and Practice, 9(1), 15-27.

Haar, J. (2001). Providing professional development for rural educators. The Annual Convention of the National Rural Education Association, 1-14.

Haberyan, A. (2007). Team-Based Learning in an industrial/organizational psychology course. North American Journal of Psychology, 9(1), 143-152.

Holland, H. (2005). Teaching teachers: Professional development to improve student achievement. American Education Research Association, 3(1), 1-4.

Hills, H. (2001).  Team-Based Learning.  Aldershot, UK: Gower Publishing Company.

Integrated Postsecondary Education Data System.  Glossary of terms.

Jankowski, N., & Provezis, S. (2011). Making student learning evidence transparent: The state of the art. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  Retrieved from http://www.learningoutcomeassessment.org/TFComponentSLOS.htm.

Johnson, R. H. (1992).  The problem of defining critical thinking. In S. P. Norris (Ed.), The generalizability of critical thinking (pp. 38-53). New York, NY: Teachers College Press. (Reprinted in Johnson, 1996.)

Johnson, D.W., Johnson, R., & Smith, K. (1998). Active Learning: Cooperation in the college classroom. Edina, MN: Interaction Book Co. 

Johnson, D. W.,  & Johnson, R. T. (1994). Learning together and alone: Cooperative, competitive, and individualistic learning (4th ed.). Boston, MA: Allyn and Bacon.

Johnson, D. W., & Johnson, R. T. (1996). Cooperative learning and traditional American values. NASSP Bulletin 80(579), 11-18.

Johnson, D. W., Johnson, R. T., & Smith, K. A. (1998). Cooperative learning returns to college. Change, 30(4), 26-35.

Johnson, D. W., Johnson, R. T., & Smith, K. A. (2007). The state of cooperative learning in postsecondary and professional settings. Educational Psychology Review 19(1), 15-29.

Johnson, D. W., Johnson, R. T., & Stanne, M. (2000). Cooperative learning methods: A meta-analysis. Retrieved from https://jamyang.wikispaces.com/file/view/Cooperative+Learning+Methods.doc.

Kalaian, S. A., & Kasim, R. M. (2009).  A meta-analysis of the effectiveness of small-group instruction compared to lecture-based instruction in science, technology, engineering, and mathematics (STEM) college courses.

Koles, P., Nelson, S., Stolfi, A., Parmelee, D., & Destephen, D. (2005).  Active learning in a year 2 pathology curriculum. Medical Education, 39(10), 1045-1055.

Kotter, J. P. (1996).  Leading Change.  Boston, MA: Harvard Business School Press. 

Kreie, J., Headrick, R., & Steiner, R. (2007). Using team learning to improve student retention. College Teaching, 55(2), 51-56.

Lewin, K. (1935). A dynamic theory of personality. New York, NY: McGraw-Hill.

Lewin, K. (1948).  Resolving social conflicts: Selected papers on group dynamics.  Gertrude W. Lewin (ed.). New York, NY: Harper and Row.

Louie, C., & Hargrave, S. (2006). Technology in Massachusetts Schools 2004-2005 (1-39). Malden, MA: Massachusetts Department of Education.

McInerney, M. J., & Fink, L. D. (2003).  Team-Based Learning enhances long-term retention and critical thinking in an undergraduate microbial physiology course.  Journal of Microbiology Education, 4(68).

McKeachie, W. J. (1990). Research on college teaching: The historical background.  Journal of Educational Psychology, 82(2), 189-200.

Meeting 1, Jan 24th.  Approved meeting minutes.  Retrieved from https://sites.google.com/site/sacsusa2013/qep-development-committee.

Michaelsen, L. K., Knight, A. B., & Fink, L. D. (2004).  Team-Based Learning: A transformative use of small groups in college teaching.  Sterling, VA: Stylus.

Michaelsen, L. K., Parmelee, D. X., McMahon, K. K., & Levine, R. E.  (2008).  Team-Based learning for health professions education: A guide to using small groups for improving learning.  Sterling, VA: Stylus.

Michaelsen, L. K., Fink, D., & Knight, A. (2002). Team-based learning: A transformative use of small groups in college teaching. Sterling, VA: Stylus Publishing.

Michaelsen, L. K., & Sweet, M. (2008).  Team-Based Learning.  National Education Association, 25(6), 1-8.

Michaelsen, L. K., & Sweet, M. (2008).  The essential elements of Team-Based Learning.  New Directions for Teaching and Learning, 116, 7-27.  Retrieved from medsci.indiana.edu/c602web/tbl/reading/michaelsen.pdf

Michaelsen, L. K., Watson, W.E., Cragin, J.P., & Fink, L.D. (1982). Team-Based Learning: A potential solution to the problems of large classes. Exchange: The Organizational Behavior Teaching Journal 7(4), 18-33.

Mission Statement (2012).  University of South Alabama.  Retrieved from www.southalabama.edu/departments/institutionalresearch

National Center for Education Statistics.  Students who study science, technology, engineering, and mathematics in postsecondary education.

National Institute for Learning Outcome Assessment (2012).  Making Learning Outcomes Usable & Transparent.  Retrieved from http://www.learningoutcomeassessment.org/TFComponentSLOS.htm

Overview of University of South Alabama Student Success.  Overview of University of South Alabama student success PowerPoint.  Retrieved from OverviewofUSAStudentSuccess.pdf.

Overview for QEP.  QEP overview retrieved from OverviewforQEPDC_Jan21_2011.ppt.

Parmelee, D. X. (2010).  Team-Based Learning: Moving forward in curriculum innovation: A commentary.  Medical Teacher, 32(2), 105-107.

President's Council of Advisors on Science and Technology (2012). Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics.  Report to the President.  Retrieved from www.whitehouse.gov/sites/default/files/.../ostp/fact_sheet_final.pdf.

Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering   Education 93(3), 223-231.

Process for Selecting the QEP.  Process for selecting the quality enhancement plan.  Retrieved from ProcessforSelectingtheQualityEnhancementPlan.Feb10,11.pdf . 

Project Kaleidoscope (2012).  Excerpts from essays.  Retrieved from https://www.aacu.org/pkal/publications. 

Rich, E. (2010). Teaching commission pushes collaborative learning teams, Education Week.Org, June 29, 2010.

Rogers, E. M. (2003).  Diffusion of innovations.  New York, NY: Free Press.

Schlechty, P. (1994). Increasing student engagement.  Missouri Leadership Academy. p. 5. 

Schleicher, A. Five things I've learned.  Retrieved from http://www.thefivethings.org/andreas-schleicher/.

Schmoker, M. (1999).  Results: The key to continuous school improvement. Alexandria, VA: Association for Supervision and Curriculum Development 

Smart, K. L., & Csapo, N. (2003).  Team-Based Learning: Promoting classroom collaboration.  Retrieved from iacis.org/iis/2003/SmartCsapo.pdf.

Smith, B., & MacGregor, J. (1992) What is collaborative learning? In collaborative learning: A sourcebook for higher education (University Park: National Center on Postsecondary Teaching, Learning, and Assessment, Pennsylvania State University, 1992), p. 11. 

Smith, K. A., Sheppard, S. D., Johnson, D. W., & Johnson, R. T. (2005).  Pedagogies of engagement: Classroom-based practices.  Journal of Engineering Education, 94(1),1-15.

Snell, M., & Janney, R. (2000). Collaborative teaming.  Baltimore, MD: Paul H. Brookes.

Springer, L., Stanne, M. E., & Donovan, S. S. (1999). Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: A meta-analysis. Review of Educational Research, 69, 21-51.

Stein, B., & Haynes, A. (2011). Engaging faculty in the assessment and improvement of students' critical thinking using the Critical Thinking Assessment Test. Change: The Magazine of Higher Learning, 43(2), 44-49.

The University of Texas at Austin (2007).  The implementation planning model: Steps to success. Construction Industry Institute.

Thompson, B., Schneider, V., Haidet, P., Levine, R., McMahon, K., Perkowski, L., & Richards, B. (2007). Team-Based Learning at ten medical schools: Two years later. Medical Education, 41(3), 250-257.

USA Office of Institutional Research, Planning and Assessment (2012).  University of South Alabama 2012 Annual Report on Undergraduate Persistence.

Vasan, N. S., & DeFouw, D. (2005).  Team learning in a medical gross anatomy course. Medical Education, 39(5), 439-513.

Waters, T., Marzano, R. J., & McNulty, B. (2003). Balanced leadership.  Retrieved November 11, 2007.

Wieman, C. (2010, June). The learning sciences and learning in the sciences: The perspective from post-secondary science education.  Invited address at the International Society for the Learning Sciences Conference, Chicago, IL.

Wiggins, G., & McTighe, J. (2005).  Understanding by design. Arlington, VA: Association for Supervision and Curriculum Development.

Williams, B. C., He, B., Elger, D. F., & Schumacher, B. E. (2007). Peer evaluation as a motivator for improved team performance in bio/ag engineering design classes. International Journal of Engineering Education, 23(4), 698-704.

Zgheib, N. K., Simaan, J. A., & Sabra, R. (2010).  Using Team-Based Learning to teach pharmacology to second year medical students improves student performance. Medical Teacher, 32(2), 130-150.

 

GLOSSARY

Collaborative Groups: A small group may be defined as two or more individuals who (a) interact with each other, (b) are interdependent, (c) define themselves and are defined by others as belonging to the group, (d) share norms concerning matters of common interest and participate in a system of interlocking roles, (e) influence each other, (f) find the group rewarding, and (g) pursue common goals (Johnson & Johnson, 1994).

Collaborative Learning: "Collaborative learning" is an umbrella term for a variety of educational approaches involving joint intellectual effort by students, or students and teachers together. Usually, students are working in groups of two or more, mutually searching for understanding, solutions, or meanings, or creating a product.  Collaborative learning activities vary widely, but most center on students' exploration or application of the course material, not simply the teacher's presentation or explication of it.  Collaborative learning represents a significant shift away from the typical teacher-centered or lecture-centered milieu in college classrooms.  In collaborative classrooms, the lecturing/listening/note-taking process may not disappear entirely, but it lives alongside other processes that are based in students' discussion and active work with the course material.  Teachers who use collaborative learning approaches tend to think of themselves less as expert transmitters of knowledge to students, and more as expert designers of intellectual experiences for students, as coaches or midwives of a more emergent learning process (Smith & MacGregor, 1992).

Collaborative Teaming: Collaborative teaming may be defined as two or more people working together toward a common goal (Snell & Janney, 2000). 

Habits of Mind: Characteristics of what intelligent people do when they are confronted with problems, the resolutions of which are not immediately apparent (Costa & Kallick, 2000).

iRAT: iRAT is an acronym for individual Readiness Assurance Test.  This is a test given at the start of an instructional unit, after students complete the assigned preparation, to assess individual mastery of content and readiness for application of that content (Michaelsen, Fink, & Knight, 2002).

Retention: A measure of the rate at which students persist in their educational program at an institution, expressed as a percentage. For four-year institutions, this is the percentage of first-time bachelors (or equivalent) degree-seeking undergraduates from the previous  fall who are again enrolled in the current fall. (Integrated Postsecondary Education Data System). 

STEM: Although the literature includes varied interpretations of the disciplines and careers associated with STEM, the term itself is an acronym for "Science, Technology, Engineering and Mathematics."  The term was coined by Dr. Judith Ramaley in 2001 when she was assistant director of the education and human resources directorate at the National Science Foundation.  For the purpose of this project, a STEM course is defined as any class where science, technology, engineering, or mathematics is integrated into course content.

Student Engagement: Student engagement occurs when students are involved in their work, persist despite challenges and obstacles, and take visible delight in accomplishing their work (Schlechty, 1994). 

Student Learning: Attainment of the knowledge, skills, attitudes, competencies, and habits of mind that students are expected to acquire at an institution of higher education (Jankowski & Provezis, 2011).

Team-Based Learning: A term first popularized by Larry Michaelsen to describe an educational strategy that he developed for use in academic settings.  TBL components include strategically formed teams, readiness assurance, application activities, and peer evaluation (Michaelsen, Watson, Cragin, & Fink, 1982).  This strategy is designed to support the development of high-performance learning teams and provide opportunities for teams to engage in significant learning tasks (Smart & Csapo, 2003).

tRAT: tRAT is an acronym for team Readiness Assurance Test.  The tRAT is identical to the iRAT, but is taken by a team immediately after the iRAT, with members working on a single answer sheet (Michaelsen, Fink, & Knight, 2002).

 
