Monday, May 29, 2006

Theological Education Journal

Note that the journal Theological Education has devoted several recent issues to discussions of assessment. Below, I have listed the relevant issues together with their contents:

Autumn 1998 Vol. 35, No. 1 Models of Assessing Institutional and Educational Effectiveness: The Pilot School Project

Introduction
Daniel O. Aleshire

Developing New Evaluative Structures and Procedures
Susan E. Davies, Bangor Theological Seminary

Evaluation: Context, Lessons, and Methods
James A. Meek, Covenant Theological Seminary

Assessment and Institutional Improvement: A Case Study
David Hogue, Garrett‐Evangelical Theological Seminary

Under Review: Comments on the Reaccreditation Process Using the New ATS Accrediting Standards
William H. Brackney and R.E. Vosburgh, McMaster Divinity College

Set in Motion: The Story of Transitions at Memphis Theological Seminary
Mary Lin Hudson, Memphis Theological Seminary

Evaluation and the Educational Effectiveness Circle
Sarah Ann Sharkey, O.P., Oblate School of Theology

Assessment and Planning in a University‐Related Theological School
Dale Launderville, O.S.B., Saint John’s University School of Theology

Mission‐Focused Evaluation: A Work in Progress
Duane A. Priebe and Kathleen L. Priebe, Wartburg Theological Seminary

2003 Vol. 39, No. 1 The Character and Assessment of Learning for Religious Vocation

The Character and Assessment of Learning for Religious Vocation: M.Div. Education and Numbering the Levites
Daniel O. Aleshire

Learning Goals and the Assessment of Learning in Theological Schools: A Preliminary Survey
Gordon T. Smith and Charles M. Wood

Knowing and Caring
Charles M. Wood

Getting to the Question: Assessment and the Professional Character of Ministry
Victor J. Klimoski

What is the Literature Saying about Learning and Assessment in Higher Education?
Carolyn M. Jurkowitz

Exploring the Process of Learning and Assessment: Report on the ATS Workshop on Assessing Theological Learning
Eleanor A. Daniel

Assessing Assessment: An Accreditation Visitor’s View of ATS Outcome‐Oriented Standards
Loyde H. Hartley

2003 Vol. 39, No. 2 Institutional Assessment and Theological Education: “Navigating Our Way”

Introduction
Jeremiah J. McCarthy

Holding Itself Accountable: The Board’s Responsibility for Self-Assessment
Rebekah Burch Basinger

Presidential Assessment: The Delicate Balance
Vincent Cushing, O.F.M.

Faculty Evaluation: Conversations with Colleagues
Richard Benson, C.M.

Assessing Spiritual Formation in Christian Seminary Communities
H. Frederick Reisz, Jr.

Student Evaluation at Kenrick School of Theology
Lawrence C. Brennan, S.T.D.

Formational Initiatives at Wycliffe College
Merv Mercer

A Call to Growth: The Potential of the Profiles of Ministry Program
Francis A. Lonsway

The Pragmatics of Assessing Master of Divinity Students
William R. Myers

Assessing a Doctor of Ministry Program
Barbara Horkoff Mutch

Serendipity or Grace? What Evaluation Has Taught Us about Education and Ecclesiology in Distance Learning
Charles E. Bouchard, O.P.

Assessment of Student Learning: Some Perspectives
John H. Erickson

Assessment of Ministry Preparation to Increase Understanding
John Harris

2006 Vol. 41, No. 2 Character and Assessment of Learning for Religious Vocation

Vocation in a New Key: Spiritual Formation and the Assessment of Learning
Mary Kay Oosdyke

Speaking Assessment in the Local Vernacular
Linda Lee Clader

Leclercq among the Blue Devils: Assessing Theological Learning in the Modern University
Willie James Jennings

Progressing Toward Ministry: Student Perceptions of the Dispositional Evaluation Process at Emmanuel School of Religion
Jack Holland

Preparing Leaders for Mission: The Experience of Assessment at Luther Seminary
James L. Boyce and Richard W. Nysse

Practicing Assessment/Resisting Assessment
Robert A. Cathey

Preaching, Proclamation, and Pedagogy: An Experiment in Integrated Assessment
Elaine Park

Moving the Mission Statement into the Classroom
Jo-Ann Badley

Evaluation Rubrics: Weaving a Coherent Fabric of Assessment
Stephen Graham, Kimberly Sangster, and Yasuyuki Kamata

Toward an Integrated Model of Assessment
Dennis H. Dirks

In addition, here is a link to the updated Theological Education Journal Index.

Saturday, May 27, 2006

Asking the right questions before SWOTing around

Some time last year, I attended the Eagles Leadership Conference in Singapore. My group had Richard Mouw, president of Fuller Theological Seminary, address us on the topic "Kingdom Partnerships: Serving Together in God's World".

Dr Mouw made several very profound points. One thing that really stuck in my mind was a conversation he related, which he had with a member of the Lilly Foundation. This person commented that seminaries which come to the foundation asking for money really need to be asking themselves three questions:

1. What is God doing in the world?

2. What does the church need to do to align itself with what God is doing in the world?

3. What does the theological institution need to do to help the church align itself with what God is doing in the world?


Instead of rushing into a SWOT analysis, I wonder whether we should ask these questions first, before we do any form of institutional assessment.

Friday, May 26, 2006

SWOT Analysis

One of the tools mentioned in the seminar is the SWOT analysis. For the uninitiated, I have posted an image below which spells out what SWOT stands for: Strengths, Weaknesses, Opportunities, and Threats.

The idea is to get your team together to brainstorm and list what the group perceives to be strengths and weaknesses (the first two are internal factors), and opportunities and threats (the second two are external factors) to your institution. Out of that is born an action plan to work with!
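If it helps to see the structure of the exercise, the internal/external and helpful/harmful axes can be captured as a simple 2x2 grid. This is only a sketch; every entry below is invented for illustration, not drawn from any real institution:

```python
# A hypothetical SWOT capture for a seminary, stored as the 2x2 grid the
# analysis implies: rows are internal/external origin, columns are
# helpful/harmful effect. All entries are invented for illustration.
swot = {
    ("internal", "helpful"): ["committed faculty"],           # Strengths
    ("internal", "harmful"): ["aging library holdings"],      # Weaknesses
    ("external", "helpful"): ["growing regional churches"],   # Opportunities
    ("external", "harmful"): ["shrinking enrollment pool"],   # Threats
}

# An action plan typically pairs strengths with the opportunities they can
# exploit, and weaknesses with the threats they leave the institution open to.
strengths = swot[("internal", "helpful")]
threats = swot[("external", "harmful")]
```

Laying it out this way makes the point of the exercise visible: every item the group names must be placed along both axes before it can feed an action plan.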

Of course, the perspective is limited to that of the group--which itself could be a strength, a weakness, an opportunity, or a threat.
Think about that. It's critical!

For more SWOT info, click here or here.

Wednesday, May 24, 2006

Institutional Assessments

One area that we need to explore more and more is institutional assessment. Once, I was sitting in a surgery waiting to see my doctor. The practice was reputable, with very good doctors listed in its brochure. My experience with the receptionist, however, was very different. Her cold rudeness seemed to overshadow all the other great services that the practice had to offer.

The analogy suggests that because we operate as a system (and often as systems within systems), what is needed is a commitment by all parts of the organic whole to great service. All the parts, not just individual parts, need to be committed to promoting, not undermining, the common vision of the institution.

The analogy also reminds me of a useful tool introduced to me by a Russian colleague, who told me that it is regularly used in Russian agricultural training circles. The tool, also called the Barrel Analysis, looks at the different elements which make up the system as the staves of a barrel.

Looking at the barrel above, you know that the amount of water the barrel can hold depends on the shortest stave, not the longest. Sometimes when we look at a curriculum, a program, a faculty, or an organization, we affirm the good bits but ignore the bits we are weak at. A good leader needs to look at the entire system, determine which areas are strong, and channel resources into developing the weak bits, so that we can engage in the task of institutional capacity building.
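The logic of the barrel is nothing more than a minimum over the staves, which a few lines make plain. The areas and scores below are invented for illustration:

```python
# Hypothetical "staves" of an institutional barrel: areas and their assessed
# scores on a 1-5 scale (names and numbers invented for illustration).
staves = {
    "Curriculum": 4,
    "Faculty development": 5,
    "Library": 3,
    "Student services": 2,
    "Finance": 4,
}

# The barrel holds only as much water as its shortest stave allows:
# overall capacity is bounded by the weakest area, not the strongest.
weakest_area = min(staves, key=staves.get)
capacity = staves[weakest_area]

# Channel resources first into every area sitting at the minimum level.
priorities = sorted(area for area, score in staves.items() if score == capacity)
```

The sketch also shows why affirming only the long staves misleads: raising "Faculty development" from 5 to a perfect score changes nothing until "Student services" is lifted.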

(OK, I am aware that Christian Schwarz uses the same barrel analogy in his book Natural Church Development, but I prefer the Russian version from my source.)

Another tool which I have found useful for institutional assessment is the Spider or Radar chart. In fact, both tools can be used in combination for institutional troubleshooting, problem posing, and problem solving.

Interestingly, the US Navy has a site which allows you to generate Spider charts that show, in a visual manner, the performance at its different centers. It's useful just to get a sense of how spider/radar charts work.
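For the curious, the mechanics behind a radar chart are simple: each area gets its own axis at an equal angle around a circle, and the score is the distance from the center along that axis. Here is a minimal sketch with invented areas and scores; any plotting tool can then connect the resulting vertices into the familiar web shape:

```python
import math

# Hypothetical assessment scores (1-5) for six institutional areas;
# both the area names and the numbers are invented for illustration.
areas = ["Curriculum", "Faculty", "Library", "Finance", "Community", "Facilities"]
scores = [4, 5, 3, 2, 4, 3]

def radar_points(labels, values):
    """Place each area on its own axis, spaced at equal angles around a
    circle; the score is the distance from the center along that axis."""
    n = len(labels)
    return [
        (v * math.cos(2 * math.pi * i / n), v * math.sin(2 * math.pi * i / n))
        for i, v in enumerate(values)
    ]

vertices = radar_points(areas, scores)
# Connecting the vertices in order (and closing the loop back to the first
# point) traces the polygon a radar chart draws.
```

Notice how this dovetails with the barrel analysis: the vertex closest to the center is the shortest stave.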

Just one last comment. The reason I love the barrel analysis and spider charts is that they help members of the team to visualize the issues. There emerges a collective ownership of the organization's issues, and a common realization of where they lie and what needs to be done by way of appropriate interventions.

Tuesday, May 23, 2006

Authentic Assessments

Below are several definitions of "authentic assessment" provided by three experts in the area:

"A form of assessment in which students are asked to perform real-world tasks that demonstrate meaningful application of essential knowledge and skills" -- Jon Mueller

"...Engaging and worthy problems or questions of importance, in which students must use knowledge to fashion performances effectively and creatively. The tasks are either replicas of or analogous to the kinds of problems faced by adult citizens and consumers or professionals in the field." -- Grant Wiggins -- (Wiggins, 1993, p. 229).

"Performance assessments call upon the examinee to demonstrate specific skills and competencies, that is, to apply the skills and knowledge they have mastered." -- Richard J. Stiggins -- (Stiggins, 1987, p. 34).


Jon Mueller has created a wonderful site called The Authentic Assessment Toolbox where he provides definitions, examples, comparisons with traditional assessments, tools, etc. This is a site worth visiting. Below I provide a screenshot of his site.

Monday, May 22, 2006

Assessing Courses and Classroom activities

We move from a focus on instructors assessing student learning to students assessing the teaching processes and learning activities we design. Theological education is a compound term: there is the theological thinking dimension, and there is the education dimension. Most of us wear our content-specialization hats but neglect our educator hats. For us to develop our teaching skills, it is sometimes useful to receive student feedback.

One way to do it is to use the free online FAST assessment tool, a service provided by Mount Royal College in Calgary, Canada. FAST stands for Free Assessment Summary Tool, and it can do wonders for our teaching practice.

Here are a few paragraphs from their FAQ page:
What is FAST?
FAST is an anonymous online survey tool that automatically summarizes students' impressions of a course and/or teacher and supplies the data directly to the teacher.

What does FAST do?
FAST allows a teacher to develop an online survey that students can complete 24 hours a day, 7 days a week. Teachers can ask up to 20 questions (and change them whenever they want) to determine how students are finding their teaching and the course. The software automatically summarizes and consolidates the students' comments, in real-time, on the web or into a downloadable customized Excel spreadsheet.

One thing I have found helpful about FAST is that it provides an extensive databank of questions, for which you can receive Yes/No, Likert-scale, multiple-choice, and long-answer responses.
Constructing your online questionnaire

In the database, you will find various areas which you can include for assessment. Below you will find a screenshot of the categories, which include Activities/Exercises, Activity Teaching, Assignments/Quizzes, Clear Expectations, and so on.
Question Database

What I did was click on "Lecture", and these questions popped up, from which you can select:
Sample questions on assessing lectures found in question database

I think this is a good tool for deans and faculty to use to surface areas they want or need to improve on. It is possible to customize individual assessments for faculty and for courses, to help us develop in the areas we feel need the most attention.
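As an aside, the kind of consolidation FAST performs on Likert-scale responses is easy to picture in a few lines. The responses below are invented, but the summary mirrors the idea of turning a pile of anonymous ratings into something a teacher can act on:

```python
from collections import Counter

# Hypothetical anonymous responses to one Likert-scale question
# (1 = strongly disagree ... 5 = strongly agree). The numbers are invented;
# the consolidation sketches what FAST reports automatically.
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

counts = Counter(responses)
# How many students chose each point on the scale (zero-filled for gaps).
summary = {scale: counts.get(scale, 0) for scale in range(1, 6)}
# A single headline number for the question.
mean = sum(responses) / len(responses)
```

Seeing the distribution, not just the mean, is the point: a 3.9 average with a cluster of 2s tells a different story from a uniform row of 4s.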

Sunday, May 21, 2006

An example of use of rubric

Here is an example of the use of rubrics to assess a church's missions involvement.


The map is available from ACME Network, and can be downloaded by clicking here

Saturday, May 20, 2006

Free Online Rubric Generators

The URLs below link to rubric generators. These are free online tools which help you create rubrics quickly. They will save you a lot of time, as well as help you think through how to design a rubric.

RubiStar's Rubric Generator
RubiStar is a tool to help the teacher who wants to use rubrics but does not have the time to develop them from scratch.

While many teachers want to use rubrics or are experimenting with writing rubrics, they can be quite time-consuming to develop. RubiStar provides generic rubrics that can simply be printed and used for many typical projects and research assignments. The unique thing about RubiStar, however, is that it provides these generic rubrics in a format that can be customized: the teacher can change almost all the suggested text in the rubric to make it fit their own project.

Below, you will find a video about RubiStar.


teAchnology's General Rubric Generator

Chicago Public Schools' Rubric Bank
In The Rubric Bank, you will find a wide variety of performance assessment scoring rubrics. These rubrics are examples of scoring rubrics that have been used by schools, districts and state departments of education throughout the country.

LandMark Project's Rubric Builder: As teachers increasingly design online learning experiences for their students, evaluation of those activities remains a challenge. The Rubric Builder enables teachers to build effective assessment rubrics and to make them available over the World Wide Web.

Fairfax County Public Schools' Performance Assessment for Language Students (PALS)

Friday, May 19, 2006

Evaluating student work using Scoring Rubrics

In the seminar, one of the points raised about evaluating students' work was the importance of communicating expectations about what we are evaluating. One way that can be done is through the use of scoring rubrics.

What are rubrics? (I'll get to the answer soon...)

In the meantime, here are some online articles from Practical Assessment, Research and Evaluation: A peer-reviewed electronic journal which explain and discuss issues surrounding the use of rubrics in education:

Moskal, Barbara M. (2000). Scoring rubrics: what, when and how?. Practical Assessment, Research & Evaluation, 7(3). Retrieved April 8, 2005 from http://PAREonline.net/getvn.asp?v=7&n=3

Mertler, Craig A. (2001). Designing scoring rubrics for your classroom. Practical Assessment, Research & Evaluation, 7(25). Retrieved April 8, 2005 from http://PAREonline.net/getvn.asp?v=7&n=25

Moskal, Barbara M. (2003). Recommendations for developing classroom performance assessments and scoring rubrics. Practical Assessment, Research & Evaluation, 8(14). Retrieved April 8, 2005 from http://PAREonline.net/getvn.asp?v=8&n=14

Tierney, Robin & Marielle Simon (2004). What's still wrong with rubrics: focusing on the consistency of performance criteria across scale levels. Practical Assessment, Research & Evaluation, 9(2). Retrieved April 8, 2005 from http://PAREonline.net/getvn.asp?v=9&n=2

Apart from the above web resources, you might want to explore this book:
Taggart, Germaine L., Sandra J. Phifer, Judy A. Nixon, and Marilyn Wood, eds. Rubrics: A Handbook for Construction and Use. Lancaster, PA: Technomic Publishing Co., Inc., 1998.
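To make the idea concrete before you dive into the readings, here is a minimal sketch of an analytic scoring rubric as data, with a weighted scoring function. The criteria, level descriptors, and weights are all invented for illustration; a real rubric would spell out full descriptors for each level:

```python
# A hypothetical analytic rubric for a short paper: three criteria, each
# rated on a 1-4 scale, with weights reflecting relative importance.
rubric = {
    "Thesis clarity": {"weight": 2, "levels": {1: "absent", 2: "vague", 3: "clear", 4: "compelling"}},
    "Use of sources": {"weight": 1, "levels": {1: "none", 2: "minimal", 3: "adequate", 4: "extensive"}},
    "Application":    {"weight": 1, "levels": {1: "missing", 2: "thin", 3: "concrete", 4: "insightful"}},
}

def score(ratings):
    """Weighted total for one student's per-criterion ratings, plus the
    maximum possible score under this rubric."""
    total = sum(rubric[criterion]["weight"] * level for criterion, level in ratings.items())
    maximum = sum(spec["weight"] * max(spec["levels"]) for spec in rubric.values())
    return total, maximum

# One student's ratings: a clear thesis, extensive sources, thin application.
total, maximum = score({"Thesis clarity": 3, "Use of sources": 4, "Application": 2})
```

The value for students lies less in the arithmetic than in the levels table itself: handed out in advance, it tells them exactly what "clear" versus "vague" will mean when the grading happens.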

Thursday, May 18, 2006

Jane Vella's How Do They Know They Know


One of the books that Dr Lua mentioned in the course of our meetings is How Do They Know They Know: Evaluating Adult Learning (Jossey-Bass Higher and Adult Education Series), jointly authored by Jane Vella, Paula Berardinelli, and Jim Burrow.

Here are some editorial reviews which I picked up from Amazon.com:

"Finally, a practical methodology for quantifying training results and organizational impact. How Do They Know They Know is a must-read for any human resources professional with the desire to measure the return on training investment for individuals and for their organization. It challenges the current paradigms about training evaluation and its role in the learning process, calling for instructors and designers to begin with the end in mind. It should be required reading for all professionals in the field of education and development." (Susan DeLuca, director of human resource development, SmithKline Beecham)

"For anyone ever faced with the task of evaluating adult education programs, this book offers an accessible, practical, well researched tool that verifies results for existing or new programs. This is an invaluable tool for all educators who need to prove what knowledge, skills and attitudes have actually been learned through a course." (Sue Button, North Carolina Administrative Office of the Courts, Training and Development)

"This book offers an excellent method to ensure that your training is effective and has an impact on your organization. How Do They Know They Know's techniques place the responsibility for training where it belongs and focus everyone involved on cooperating to achieve the desired results. Developing training, with corresponding measurements focused on its impact on the organization, is an excellent method to ensure good training design. If effective training is important to your organization, How Do They Know They Know is a must." (Clif Melvin, education specialist, Fortune 500 computer manufacturer)

"The authors' skills, knowledge, and attitudes in the evaluation of learning experiences are accessible, informed, and personally engaging. This book serves as a working guide for my design and implementation of learning experiences, from community workshops to college courses, and from leadership training to educational support group facilitation with parents. ... This is a short course in deepening Jane Vella's question 'How Do They Know They Know?' They just did it! They continue to do it and have convinced their organization to do more of it! ... Accountability! Accountability! Accountability! At last I have found here a usable means not only to 'know they know' during the training, but to know that what I teach makes a meaningful difference in the learner's work and impacts the organization positively." (Peter Perkins, director, Prevention Unlimited, Inc.)

"Books such as this are useful reminders that as the need for training programs or competency-focused curricula grows, the competition also will increase, and the institutions with objective proof of their success will succeed over the long term." (Continuing Higher Education Review)

Wednesday, May 17, 2006

Using PowerPoint to facilitate CATs

PowerPoint is a wonderful modern technological add-on, but it often serves to emphasize the gap between the instructor and the student. In fact, it enhances the power of the instructor, because information can be presented in multimedia forms, not just verbally or textually.

PowerPoint, however, can also be used to allow teachers to listen to what students have to say about a particular subject. Traditionally, it is the instructor's thinking that is most visible in the classroom. In what ways can PowerPoint be used to surface student thinking and make their thoughts more visible?

This of course is a call to shift from teacher centeredness. It is an invitation for students to engage in dialogue and for instructors to begin to listen.

There are several ways of doing this, but one of the ways is to do a CATs exercise using Powerpoint. Here is an example of how it can be done from the Center for Teaching and Learning Services, University of Minnesota. Four CATs are featured here: i) Focused listing, ii) Classroom Opinion Polls, iii) Minute Paper (though here they have an adaptation - the Two Minute Paper) and iv) Muddiest Point.

These are quick, ready-to-use ways of assessing student learning: seeing what students know or don't know, and helping the instructor determine what actions need to be taken to help students acquire the knowledge, skills, competencies, and other graduate outcomes we want to see in place.

Saturday, May 13, 2006

Where does one start? The end or the beginning?

Using a blog to share resources always poses a challenge. Because of the "diary" form of the blog, there is chronological sequencing to deal with. Which raises the question: shall I start by sharing resources on assessment, or should I start with trying to define goals and projected outcomes?

Well, since the impetus for this blog was our discussion on evaluation/assessment, I'll start with a really useful book called Classroom Assessment Techniques (CATs for short) by Thomas A. Angelo and K. Patricia Cross. I first came across this book in a PhD seminar on Teaching and Learning in Higher Education with Dr Linda Cannell, and have since found its fifty techniques really useful for achieving teaching goals.

What is classroom assessment? Here's a short answer lifted from the website of the Center for Teaching and Learning Services, University of Minnesota.

Classroom assessment is a practice that provides instructors feedback on what and how much their students are learning. Instructors use the information they gather in this way to measure the effectiveness of their teaching practices, make decisions, and implement changes that result in better student learning.

To gather this feedback, instructors administer assessment techniques which are delivered and collected in class. These can be used at various points during the session depending on the instructor's objectives and the feedback she wishes to receive. For example, a technique used at the beginning of class might gauge students' prior knowledge of the subject matter, allowing the instructor to tailor her content delivery to the specific needs of her audience. Others may be better suited for the middle or the end of the session, helping the instructor identify how well students have grasped the material.


What I really like about this book is that it helps teachers to think through their teaching goals using a teaching goals inventory. Beyond this step, the book helps to match your teaching goals with the most appropriate CAT to help with assessments.

Like I said, there are fifty techniques that can be used, and you can sample them on the Southern Illinois University site (click on the links in the left-hand column).

Two of the most commonly used are the Muddiest Point and the Minute Paper. Remember, what is happening with these assessments is that we are helping students evaluate what they know or don't know, with the aim of providing teacher interventions that help students to grow. The assessments we are dealing with here are formative assessments, not summative assessments like finals, which test students but often don't attempt to help them find out what went wrong!

Friday, May 12, 2006

My reflections on the ATA Deans' seminar

So often we think in terms of the pilgrimage of the student entering our institutions. My great takeaway from this meeting was the sense that there is an openness and honesty amongst leaders in the ATA to explore the idea of the pilgrimage of the institution. Thus, in addition to the ideas of academic, ministerial, and personal formation, another category has been added, i.e., institutional formation. The call to deal with institutional trauma, as well as the steps taken to offer help in curriculum development, etc., are all part of the emerging consciousness of the need for institutional development/formation. The sharing of "thick and rich descriptions" from different schools helped fire my imagination and added new benchmarks and indicators which encourage us in the right direction.

An alternative view to the twenty-some other group photos taken by Regina

Thursday, May 11, 2006

ATA Deans' Seminar, Singapore



What a wonderful opportunity it was meeting with you guys. As I suggested to Dr. Sanders, I am going to create a resource blog for colleagues. These will be educational resources which can be used to support the ministry and administry that we are engaged in.

I trust that these resources will be useful for all who visit. What I will try to do is to include some links and add commentaries to help you make sense of the resource.

One of the distinctions that Dr. Sanders made yesterday was between information-dissemination technologies and educational or learning technologies. As long as one person controls the information in this blog, it remains very much the former. At a later stage, I want to invite some of you to contribute your thoughts and resources on this same blog. (The issue here is not control; it is about managing the space and the voices. You can, of course, post comments at the bottom of each post if you wish.) I think you will see the difference when two or three gather here to dialogue, share thoughts, and so on. You will then experience the power of community blogging.

Until then, I hope to make this site a rich, sticky environment where you can find valuable educational resources.

Have a safe journey back home!