Tuesday, October 25, 2005

Virtually better than the real thing

We hear a lot about simulations lately. It seems that they are popping up everywhere and there is no small buzz about them in the educational community. But are they really better than the real thing?

The answer to that question is an unequivocal “yes and no”. I think that whenever possible students should have hands-on learning opportunities with the real thing – in our world that means technology. The more experience they can get designing, planning, installing, configuring, maintaining and troubleshooting real technology the better. But that experience can be costly and even disastrous.

For example, not too long ago I was working with our SQL-based CRM system and I inadvertently erased the entire customer database! Oops! I must confess that it was a lesson that I will NEVER forget - but at what cost? Fortunately we were able to recover the vast majority of the data but it was expensive and disruptive to our business. So I am compelled to argue that although an extremely effective learning event, it was not an efficient or recommended learning path.

My point is that although experience is often the best teacher, it may not be the most efficient way to learn. This is where simulated labs come into focus.

Simulated labs, if done properly, can provide students with as much reality as is necessary to give them the hands-on experience they need to learn real skills. The upside is that they do not have to erase the entire corporate database to learn one feature of SQL. There are other reasons that simulated labs are a good idea. Here are a few:

First, simulated labs are scalable. Real technology comes in the size that it comes in, with all the features and functionality intact. Simulated labs can focus on just one feature set or one aspect of the technology. I do not need an entire live network, for example, to add a user. With smaller “chunks” of functional technology, hands-on learning becomes affordable and feasible. It can also be delivered from a desktop or laptop – something very difficult to do when you consider the size of your average computer lab, server rack or hardware lab.

Second, simulated labs are flexible. Because simulated labs are a digital recreation of the technology they represent, they can be configured in an almost infinite number of combinations, situations and technical configurations. In the real world, to help students learn troubleshooting skills, the instructor would literally have to break something on a machine so the student could diagnose the problem and then fix it. If the teacher-student ratio were one-to-one, and if there were an amazing hardware budget, this scenario might work. Unfortunately, this is not the case for most of us. To configure machines for 20 students could take all day, and that would be for only one exercise that would not even fill the entire lecture period. To provide students with the hundreds if not thousands of hands-on learning experiences they need to adequately learn how to use technology, simulated labs become a viable option.

Third, simulated labs can be portable. With the advent of the internet, the ability to provide hands-on experience at a distance is now possible. Not too long ago, the only way to gain hands-on experience was to physically go to a computer lab. This has obvious limitations in both capacity and accessibility. Now, with simulated labs, students can take a copy of their own computer lab home with them. This has significant ramifications for distance education, homework and those with disabilities.

Fourth, simulated labs are designed for learning. Real technology is not designed to teach. It is designed to accomplish the tasks it was intended to accomplish (i.e. route packets, process data, serve web content, etc.). Simulated labs, on the other hand, are instructional by design. Their purpose is to teach. Therefore they are designed to provide performance evaluation, feedback and even instruction. Learning scenarios can be reset and practiced over and over again. Technical processes can be artificially slowed down to demonstrate difficult concepts, or time-consuming processes can be sped up to avoid wasting time waiting for them to complete. Try doing that with real technology!
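The time-scaling idea above can be sketched in a few lines. The durations and the “time scale” parameter here are invented for illustration; no real product is being described:

```python
# Minimal sketch of a simulation clock with a configurable time scale.
# A scale > 1 compresses a slow process; a scale < 1 stretches a fast one
# so learners can observe it. All numbers below are illustrative.

def simulate_process(real_duration_s, time_scale):
    """Return how long the learner actually waits for a simulated process."""
    return real_duration_s / time_scale

# A 600-second disk format compressed into 5 seconds of waiting:
print(simulate_process(600, 120))  # 5.0

# A 1-second network handshake stretched to 4 seconds for teaching:
print(simulate_process(1, 0.25))   # 4.0
```

The same mechanism works in both directions, which is the point of the paragraph above: the simulated clock, not the real process, decides how long the learner waits.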

Fifth, simulated labs are affordable. When you consider the cost of technology and the rapid rate of technical obsolescence, and multiply it by the number of students in the program, the result is a significant budgetary issue. Unfortunately, few schools are funded to the level they would like or need to be. Earlier simulations and simulated labs were very expensive and cost-prohibitive for the average user. Now online simulated labs can be found for the cost of a textbook.

So, I must confess that although I am a die-hard technology fan and believe that hands-on experience with real technology is an essential component of the skills acquisition process, it is not the only way to learn. In fact, I would dare say that hands-on learning via simulated labs is a much better way to prepare for more sophisticated experiences with real technology. Let learning occur through hands-on experience with high-fidelity simulated labs, and save erasing the corporate database until you are fully qualified.

Monday, October 03, 2005

Virtual Certification Testing

A Vision of the Future
You are the new site administrator for a major financial institution. Besides keeping the network running smoothly, you must battle an assortment of hackers, crackers, thieves, and freaks whose goal is getting to your network data. There was a security breach only a month ago. Today you get alarming news: network throughput to headquarters in New York is down by 42%, packet collisions are up by 113% across the LAN, server utilization is nearing 100%, and disk space is low. You receive a security alert from your audit software indicating that permissions and rights have been modified on certain secure folders. You’ve got a problem – a big problem.

What if this were not only the cause of a full-scale migraine but also part of your certification exam? Science fiction? Nope. In the not-too-distant future, certification exams will do more than give a percentage or indicate a “pass” or “fail” on a single vendor technology. Certifications will present real-world, problem-based scenarios, measuring how many years of experience you have and testing the skills required in a heterogeneous environment. Current strides in simulation technology make all of this possible.

Simulations Today
For years we have heard mention of simulations and how they will impact the certification industry someday. The reality is that simulations are here now and they are gaining a critical mass that will ultimately result in a quantum leap in how we are tested.

Certification providers such as Novell and Cisco have already incorporated simulations into their exams. Microsoft has announced its own plans to add simulations to its certification testing. Virtually every other major certification provider is exploring similar options.

Recently, Cisco began delivering simulations in their tests, requiring candidates to interact with simulated Cisco networks and perform real-world tasks. Cisco argues that–

“Because the scenarios used in simulations provide a more realistic environment, they are better able to measure degree of knowledge and skill; for example, that a candidate can complete a task with the right commands, in the right order, and understands when the task is complete” (emphasis added).

The Cisco announcement pitches the addition of simulation-based testing as part of the “continuous improvement” of the exam to “ensure that employers and employees continue to rely on Cisco certifications to meet the demand for technical skills in networking.” What employers demand–and what the certification industry is rushing to provide–is a certification process that certifies that students can perform the tasks of the trade rather than regurgitate answers about IT theory. The complaint of employers is that newly certified job candidates who have managed to pass certification exams are too often unable to deliver when it comes to the everyday tasks of running a network. Many employers have discovered a painful disconnect between the certification credential and the skills that only come from experience. The message is clear: “memorization is not enough,” and simulations are increasingly used to provide true “performance-based testing.”

The growing consensus among certification providers is that adding simulations to a certification program builds both the confidence of the individual and the employer in the meaning and reliability of the certification process. Test-takers seem to concur, enthusiastically embracing the addition of simulation exercises and regarding them as a more real experience.

It is important to understand, however, that not all simulations are created equal. Two very different types of simulation technologies can be used in testing environments—we call them Dumb Simulations and Intelligent Simulations.

Dumb Simulations
“Dumb” simulations (like Dumb Terminals without internal logic, rules or processing ability) basically run the user through a series of screen captures. Designers of dumb simulations simply link individual screens to recreate the simulated task. These types of simulations allow only a very finite and limited interaction with the technology, often allowing only one pre-determined path to the end result. Dumb simulations provide little logic, few rules, and limited functionality. For example, if your task were to add a new printer in Windows 2000, you would be presented with a Windows 2000 desktop with an active link to the Start button. At the extreme this button might be the only active spot on the entire desktop. After clicking the Start button, the simulation moves forward to the next screen, where a link would be defined for the Printers menu item. This process would continue until the very last screen of the simulation. Users progress through the simulation by taking the single path through the screens, clicking on the hotspots on each screen. Simulations created in this manner prevent users from deviating from the predetermined course.

Because “dumb” simulations are often created with screen captures, there are very real pressures to keep file sizes down and to keep the linking as simple as possible. In fact, most of these simulations do not define paths for wrong answers–a major limitation for evaluation purposes. If “dumb” simulations were implemented in a testing environment, users could easily discover the correct path through a process of elimination.
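The linked-screen design described above amounts to a one-path state machine: each screen has exactly one active hotspot, and any other click goes nowhere. A minimal sketch, with all screen and hotspot names invented for illustration:

```python
# Minimal sketch of a "dumb" simulation: a linked list of screen captures,
# each exposing a single clickable hotspot that leads to the next screen.
# The screens and hotspots below are illustrative, not any vendor's format.

class Screen:
    def __init__(self, name, hotspot):
        self.name = name        # the screen capture shown to the user
        self.hotspot = hotspot  # the ONLY active spot on this screen
        self.next = None        # the single predetermined next screen

def run_dumb_simulation(start, clicks):
    """Advance only when the user clicks the one defined hotspot."""
    screen = start
    for click in clicks:
        if screen is None:
            break
        if click == screen.hotspot:  # any other click is simply ignored
            screen = screen.next
    return screen is None            # True = reached the end of the path

# The single allowed path for "add a printer":
s3 = Screen("Add Printer Wizard", "Next")
s2 = Screen("Printers folder", "Add Printer")
s1 = Screen("Desktop", "Start")
s1.next, s2.next, s3.next = s2, s3, None

print(run_dumb_simulation(s1, ["Start", "Add Printer", "Next"]))  # True
print(run_dumb_simulation(s1, ["My Computer", "Start"]))          # False
```

Note how the second run goes nowhere: clicks off the predetermined path are swallowed, which is exactly why users can discover the correct path by elimination.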

“Dumb” simulations are cheaper and easier to create in the short run. They lend themselves to a single program, a single task, and/or single path applications. Although “dumb” simulations may have some instructional value, they have serious limitations in the testing environment. Dumb simulations offer the designer a cheap short-cut to providing a simulation-like experience.

Intelligent Simulations
At the other end of the simulation spectrum you have what are called “Intelligent” simulations. They distinguish themselves by the wealth of options presented to the user, by the dynamic nature of the simulation, and by the complexity of logic and rules that make the simulations work. These simulations generally rely on a simulation engine or simulator that is programmed to act like the real thing. For example, if you have a Windows 2000 simulator, you could have a multitude of individual simulations that work with it.

Let’s say a particular simulation asks you to add a printer. With the Windows 2000 simulator, you simply navigate through the simulated Windows environment following whichever path you desire. You could even do things that are completely unrelated to the original task.

The ability to do things other than the original task is very important in a testing environment. The system responds to your input exactly as a real system would respond. You can perform any task in the simulation that you could perform in the actual system.

With open simulations it is impossible to discover the correct path by trial and error. In fact, the path you take to achieve the task need not even be evaluated. Intelligent simulations emphasize the end result – not the path. This flexibility is a key characteristic of open simulations.
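This end-state style of grading can be sketched as a tiny state-based simulator. The commands and the grading rule below are invented assumptions, not a real simulation engine’s API:

```python
# Minimal sketch of an "intelligent" simulation: the simulator holds system
# state, accepts any command in any order, and grading inspects only the
# final state, never the path taken. All names here are illustrative.

def make_simulator():
    return {"printers": set(), "users": set()}  # simulated system state

def apply_command(state, command, arg):
    """Respond to any input, including actions unrelated to the graded task."""
    if command == "add_printer":
        state["printers"].add(arg)
    elif command == "remove_printer":
        state["printers"].discard(arg)
    elif command == "add_user":
        state["users"].add(arg)

def grade(state, required_printer):
    """Evaluate the end result only: is the required printer installed?"""
    return required_printer in state["printers"]

# Two candidates take different paths; both reach the required end state.
a = make_simulator()
apply_command(a, "add_printer", "HP LaserJet")

b = make_simulator()
apply_command(b, "add_user", "jsmith")  # an unrelated detour is allowed
apply_command(b, "add_printer", "HP LaserJet")

print(grade(a, "HP LaserJet"), grade(b, "HP LaserJet"))  # True True
```

Because only the end state is checked, wandering through unrelated tasks neither helps nor hurts the candidate, which is what makes trial-and-error path hunting useless here.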

Creating open simulations typically requires a team of–
Designers, who identify the task to be simulated (usually setting up a specific network configuration), define how the simulation should work, and set criteria for successful completion.
Programmers, who take the design specifications and create the simulation programs that replicate the system environment.
Testers, who verify that the simulation functions as required.

This overhead makes producing the first simulation more expensive than producing a single “dumb” simulation. But once you have programmed the functionality, rules, commands and logic of the underlying simulators, additional simulations can be created easily with dramatically diminishing marginal costs.

“Intelligent” simulations lend themselves to simulating multiple programs, such as within a desktop or network operating system, or when integrating multiple technologies from different vendors (such as Microsoft, Cisco and Novell). The ability to test troubleshooting skills is simply not possible with any other approach. “Intelligent” simulations monitor the states of a variety of programs and respond appropriately to user input as they make changes to the system.

Simulations, the Driving Force of Change
With a basic understanding of the distinctions between simulations, there are four key trends that you should keep in mind.

Testing in a Problem-based Environment
Today, the dominant instructional design methodology for learning is based on a topical organization. Certification exams follow suit. In contrast, simulation technology allows the learning and ultimately the testing methodology to be organized around a problem-based scenario. Dr. M. David Merrill of Utah State University, a leading instructional designer, has developed an instructional design approach he calls the “Pebble-in-the-Pond Model for Instructional Design.” This model is based on a sequence of progressively challenging simulation-based problems that engage the learner in real-world situations. This approach can easily be applied to the testing environment using a similar sequence of real-world problems. The ability to deliver real-world problems in a simulated environment will allow certification candidates to show their stuff in environments that are closer to the real thing. Creating certification exams that will be delivered in a simulated environment fuels the need for additional simulation types, including hardware simulations.

Simulating Hardware
Many of the IT simulations you find on the market today simulate software, operating systems, and networks. That will not remain the case for long. Simulations are evolving to include hardware components and peripheral devices. Fueled by advances in 3D rendering, virtual reality, and gaming technologies, simulations will increasingly be developed for hardware. This should come as no surprise as you consider that the newest entrants to the greater IT certification community have grown up with Nintendo, Sega, and Sony Playstations. This group is not only comfortable navigating virtual worlds, but they are also sophisticated connoisseurs of graphical environments. Creating a wider array of simulations, including simulating hardware, opens the door to truly heterogeneous or multi-vendor simulations.

Simulating Multi-vendor Technologies
Simulations today, especially “dumb” simulations, are infamously homogeneous. They are developed to simulate a specific concept or task. “Intelligent” simulations, however, offer the opportunity for connecting simulations together to create a simulated network. If developed correctly, these linked simulations operate like the real thing. Imagine a Windows 2000 Desktop simulation connecting to a NetWare Server simulation communicating through a Cisco Router simulation. Settings can be reconfigured, things can be broken, and equipment and software can be added, all running from a DVD or CD on your laptop. This ability to create heterogeneous, simulated networks, with full-scale hardware, software, and operating systems, opens the door to new types of certification programs and training materials—products that more closely resemble the realities of the IT world and allow for the test to measure your real-world experience.

Measuring Experience
As we all know, certification score reports are articulated in percentages and pass/fail marks. Simulations will change all that. With simulation-based assessment technology and job task standards, we are not far from the time when students will not be given a pass/fail report; rather, they will be judged based on the amount of real-world experience they have demonstrated. Score reports of the future will identify how many years of experience students have in a particular field. You can see this trend already beginning with experience-based prerequisites for courses.

Microsoft states that:

A Microsoft Windows 2000 MCSE candidate is expected to have at least one year of hands-on experience implementing and administering a network operating system to pass the exam.

Today, experience requirements are loose suggestions; in the future they will become quantifiable assets. Not only will prerequisites be stated in terms of experience, but the outcome of training will also be stated in years. Imagine your score report stating, “Congratulations, you have demonstrated 6.5 years of experience in Network Administration.”

Looking Forward
Simulations are changing the landscape of certification testing. Regardless of whether multiple-choice exams were adequate for the task, there is little doubt that the certification industry is quickly moving towards simulation-based testing. Initially driven by a concern over “Paper Certification,” the inevitable trend towards simulations is now driven by the need of certification providers to keep up with industry demands and technological innovations.

When you integrate the power of simulations into testing environments, the result is not just an alternative way of doing things. It is a fundamentally new and better way to be evaluated. IT vendors will benefit because simulation items are inherently more difficult to reproduce on brain dumps than traditional test items. Simulation items can also be created more quickly and cost-effectively, without the need to create “distractors.” Candidates will benefit because simulations are a better measurement of their skills and provide a more enjoyable testing experience that resembles their job. Employers will experience increased confidence in the certification credential and in the people they hire. We are just beginning to see the influence of simulation technology on our industry. Just as photocopiers, e-mail, and cell phones transformed the way we conduct our daily business, simulations will transform the certification industry in the near future.
