A Vision of the Future
You are the new site administrator for a major financial institution. Besides keeping the network running smoothly, you must battle an assortment of hackers, crackers, thieves, and freaks, all bent on getting to your network data. There was a security breach only a month ago. Today you get alarming news: network throughput to headquarters in New York is down by 42%, packet collisions are up by 113% across the LAN, server utilization is nearing 100%, and disk space is low. You receive a security alert from your audit software indicating that permissions and rights have been modified on certain secure folders. You’ve got a problem–a big problem.
What if this were not only the cause of a full-scale migraine but also part of your certification exam? Science fiction? Nope. In the not-too-distant future, certification exams will do more than give a percentage or indicate a “pass” or “fail” on a single vendor technology. Certifications will present real-world, problem-based scenarios, measuring how many years of experience you have and testing the skills required in a heterogeneous environment. Current strides in simulation technology make all of this possible.
Simulations Today
For years we have heard mention of simulations and how they will impact the certification industry someday. The reality is that simulations are here now and they are gaining a critical mass that will ultimately result in a quantum leap in how we are tested.
Certification providers such as Novell and Cisco have already incorporated simulations into their exams. Microsoft has announced its own plans to add simulations to its certification testing. Virtually every other major certification provider is exploring similar options.
Recently, Cisco began delivering simulations in its exams, requiring candidates to interact with simulated Cisco networks and perform real-world tasks. Cisco argues that–
“Because the scenarios used in simulations provide a more realistic environment, they are better able to measure degree of knowledge and skill; for example, that a candidate can complete a task with the right commands, in the right order, and understands when the task is complete” (emphasis added).
The Cisco announcement pitches the addition of simulation-based testing as part of the “continuous improvement” of the exam to “ensure that employers and employees continue to rely on Cisco certifications to meet the demand for technical skills in networking.” What employers demand–and what the certification industry is rushing to provide–is a certification process that certifies that students can perform the tasks of the trade rather than regurgitate answers about IT theory. The complaint of employers is that newly certified job candidates who have managed to pass certification exams are too often unable to deliver when it comes to the everyday tasks of running a network. Many employers have discovered a painful disconnect between the certification credential and the skills that only come from experience. The message is clear: “memorization is not enough,” and simulations are increasingly used to provide true “performance-based testing.”
The growing consensus among certification providers is that adding simulations to a certification program builds both the individual’s and the employer’s confidence in the meaning and reliability of the certification process. Test-takers seem to concur, enthusiastically embracing the addition of simulation exercises and regarding them as a more realistic experience.
It is important to understand, however, that not all simulations are created equal. Two very different types of simulation technologies can be used in testing environments—we call them Dumb Simulations and Intelligent Simulations.
Dumb Simulations
“Dumb” simulations (like dumb terminals, which have no internal logic, rules, or processing ability) basically run the user through a series of screen captures. Designers of dumb simulations simply link individual screens to recreate the simulated task. These simulations allow only a very limited interaction with the technology, often permitting just one pre-determined path to the end result. Dumb simulations provide little logic, few rules, and limited functionality. For example, if your task were to add a new printer in Windows 2000, you would be presented with a Windows 2000 desktop with an active link to the Start button. At the extreme, this button might be the only active spot on the entire desktop. After clicking the Start button, the simulation moves forward to the next screen, where a link would be defined for the Printers menu item. This process would continue until the very last screen of the simulation. Users progress through the simulation by taking the single path through the screens, clicking on the hotspots on each screen. Simulations created in this manner prevent users from deviating from the predetermined course.
Because “dumb” simulations are often created with screen captures, there are very real pressures to keep file sizes down and to keep the linking as simple as possible. In fact, most of these simulations do not define paths for wrong answers–a major limitation for evaluation purposes. If “dumb” simulations were implemented in a testing environment, users could easily discover the correct path through a process of elimination.
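Under the hood, a “dumb” simulation is little more than a chain of screen captures, each with exactly one live hotspot. A minimal sketch, assuming a hypothetical add-a-printer task (the screen and hotspot names are illustrative, not taken from any real product):

```python
# A "dumb" simulation: a chain of screens where each screen defines
# exactly one active hotspot leading to the next screen. There is no
# model of the system underneath, and no path for wrong answers --
# any click outside the hotspot simply does nothing.

SCREENS = {
    "desktop":    {"hotspot": "Start button",  "next": "start_menu"},
    "start_menu": {"hotspot": "Printers item", "next": "printers"},
    "printers":   {"hotspot": "Add Printer",   "next": "wizard"},
    "wizard":     {"hotspot": "Finish button", "next": "done"},
}

def click(screen, target):
    """Advance only if the user hits the one predefined hotspot."""
    entry = SCREENS.get(screen)
    if entry and target == entry["hotspot"]:
        return entry["next"]
    return screen  # every other click is ignored

# Walking the single allowed path:
state = "desktop"
for target in ["Start button", "Printers item", "Add Printer", "Finish button"]:
    state = click(state, target)
print(state)  # -> done
```

Because the only responsive element on each screen is the correct one, a test-taker can recover the “right” path simply by clicking around until something reacts–precisely the process-of-elimination weakness described above.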
“Dumb” simulations are cheaper and easier to create in the short run. They lend themselves to single-program, single-task, single-path applications. Although “dumb” simulations may have some instructional value, they have serious limitations in the testing environment. Dumb simulations offer the designer a cheap short-cut to providing a simulation-like experience.
Intelligent Simulations
At the other end of the simulation spectrum are what we call “Intelligent” simulations. They distinguish themselves by the wealth of options presented to the user, by the dynamic nature of the simulation, and by the complexity of the logic and rules that make the simulations work. These simulations generally rely on a simulation engine, or simulator, that is programmed to act like the real thing. For example, if you have a Windows 2000 simulator, you could have a multitude of individual simulations that work with it.
Let’s say a particular simulation asks you to add a printer. With the Windows 2000 simulator, you simply navigate through the simulated Windows environment following whichever path you desire. You could even do things that are completely unrelated to the original task.
The ability to do things other than the original task is very important in a testing environment. The system responds to your input exactly as a real system would respond. You can perform any task in the simulation that you could perform in the actual system.
With intelligent simulations it is impossible to discover the correct path by trial and error. In fact, the path you take to achieve the task need not even be evaluated. Intelligent simulations emphasize the end result–not the path. This flexibility is a key characteristic of intelligent simulations.
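The contrast with the screen-capture approach can be pictured as a small state model: the simulator accepts any action the real system would, and the exam grades only the final state. A hedged sketch–the printer task, class, and field names are hypothetical illustrations, not any vendor’s actual engine:

```python
# An "intelligent" simulation: a state model that accepts any sequence
# of actions and is graded only on the resulting end state, not on the
# path the candidate took to reach it.

class SimulatedSystem:
    def __init__(self):
        self.printers = []
        self.wallpaper = "default"

    def add_printer(self, name):
        self.printers.append(name)

    def set_wallpaper(self, name):   # unrelated to the task --
        self.wallpaper = name        # allowed, just not scored

def task_complete(sim):
    """Success criterion: a printer named 'HP LaserJet' is installed."""
    return "HP LaserJet" in sim.printers

sim = SimulatedSystem()
sim.set_wallpaper("beach")       # a detour; does not affect the outcome
sim.add_printer("HP LaserJet")   # the action that actually matters
print(task_complete(sim))        # -> True
```

Because scoring inspects the end state rather than the click sequence, wandering off-task costs the candidate nothing, and there is no single “correct path” to discover by elimination.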
Creating intelligent simulations typically requires a team of–
Designers, who identify the task to be simulated (usually setting up a specific network configuration), define how the simulation should work, and set criteria for successful completion.
Programmers, who take the design specifications and create the simulation programs that replicate the system environment.
Testers, who verify that the simulation functions as required.
This overhead makes producing the first simulation more expensive than producing a single “dumb” simulation. But once you have programmed the functionality, rules, commands and logic of the underlying simulators, additional simulations can be created easily with dramatically diminishing marginal costs.
“Intelligent” simulations lend themselves to simulating multiple programs, such as within a desktop or network operating system, or when integrating multiple technologies from different vendors (such as Microsoft, Cisco and Novell). Testing troubleshooting skills is simply not possible with any other approach. “Intelligent” simulations monitor the states of a variety of programs and respond appropriately as users make changes to the system.
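The multi-vendor idea can be sketched as independent simulators wired together, each holding its own state–which is also what makes troubleshooting items possible, since one component can be deliberately broken. The class names and interfaces below are hypothetical, invented purely for illustration:

```python
# Sketch of linked simulators forming a heterogeneous network: a
# desktop simulator sends traffic through a router simulator to a
# server simulator. "Breaking" the router breaks the path, which is
# the basis of a troubleshooting exam item.

class RouterSim:
    def __init__(self):
        self.up = True
    def forward(self, packet):
        return packet if self.up else None

class ServerSim:
    def receive(self, packet):
        return f"ack:{packet}" if packet else "timeout"

class DesktopSim:
    def __init__(self, router, server):
        self.router, self.server = router, server
    def ping(self):
        return self.server.receive(self.router.forward("ping"))

router, server = RouterSim(), ServerSim()
desktop = DesktopSim(router, server)
print(desktop.ping())   # -> ack:ping
router.up = False       # fault injected for a troubleshooting item
print(desktop.ping())   # -> timeout
```

A candidate facing the broken configuration would have to diagnose which simulated component failed and repair it–an exercise a single-path, screen-capture simulation cannot express at all.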
Simulations, the Driving Force of Change
With these distinctions between simulations in mind, there are four key trends that you should watch.
Testing in a Problem-based Environment
Today, the dominant instructional design methodology for learning is based on a topical organization. Certification exams follow suit. In contrast, simulation technology allows the learning and, ultimately, the testing methodology to be organized around a problem-based scenario. Dr. M. David Merrill of Utah State University, a leading instructional designer, has developed an instructional design approach he calls the “Pebble-in-the-Pond Model for Instructional Design.” This model is based on a sequence of progressively challenging simulation-based problems that engage the learner in real-world situations. This approach can easily be applied to the testing environment using a similar sequence of real-world problems. The ability to deliver real-world problems in a simulated environment will allow certification candidates to show their stuff in environments that are closer to the real thing. Creating certification exams that will be delivered in a simulated environment fuels the need for additional simulation types, including hardware simulations.
Simulating Hardware
Many of the IT simulations you find on the market today simulate software, operating systems, and networks. That will not remain the case for long. Simulations are evolving to include hardware components and peripheral devices. Fueled by advances in 3D rendering, virtual reality, and gaming technologies, simulations will increasingly be developed for hardware. This should come as no surprise when you consider that the newest entrants to the greater IT certification community have grown up with Nintendo, Sega, and Sony PlayStation consoles. This group is not only comfortable navigating virtual worlds, but also made up of sophisticated connoisseurs of graphical environments. Creating a wider array of simulations, including hardware simulations, opens the door to truly heterogeneous, multi-vendor simulations.
Simulating Multi-vendor Technologies
Simulations today, especially “dumb” simulations, are infamously homogeneous. They are developed to simulate a specific concept or task. “Intelligent” simulations, however, offer the opportunity for connecting simulations together to create a simulated network. If developed correctly, these linked simulations operate like the real thing. Imagine a Windows 2000 Desktop simulation connecting to a NetWare Server simulation communicating through a Cisco Router simulation. Settings can be reconfigured, things can be broken, and equipment and software can be added, all running from a DVD or CD on your laptop. This ability to create heterogeneous, simulated networks, with full-scale hardware, software, and operating systems, opens the door to new types of certification programs and training materials—products that more closely resemble the realities of the IT world and allow the test to measure your real-world experience.
Measuring Experience
As we all know, certification score reports are articulated in percentages and pass/fail marks. Simulations will change all that. With simulation-based assessment technology and job task standards, we are not far from the time when students will not be given a pass/fail report; rather, they will be judged on the amount of real-world experience they have. Score reports of the future will identify how many years of experience students have in a particular field. You can see this trend already beginning with experience-based prerequisites for courses.
Microsoft states that:
A Microsoft Windows 2000 MCSE candidate is expected to have at least one year of hands-on experience implementing and administering a network operating system to pass the exam.
Today, experience requirements are loose suggestions—in the future they will become quantifiable assets. Not only will prerequisites be stated in terms of experience, but also the outcome of training will be stated in years. Imagine your score report stating, “Congratulations, you have demonstrated 6.5 years of experience in Network Administration.”
Simulations are changing the landscape of certification testing. Regardless of whether multiple-choice exams were adequate for the task, there is little doubt that the certification industry is quickly moving towards simulation-based testing. Initially driven by a concern over “Paper Certification,” the inevitable trend towards simulations is now driven by the need of certification providers to keep up with industry demands and technological innovations.
When you integrate the power of simulations into testing environments, the result is not just an alternative way of doing things. It is a fundamentally new and better way to be evaluated. IT vendors will benefit because simulation items are inherently more difficult to reproduce on brain dumps than traditional test items. Simulation items can also be created more quickly and cost-effectively, without the need to create “distractors.” Candidates will benefit because simulations are a better measurement of their skills and provide a more enjoyable testing experience that resembles their job. Employers will experience increased confidence in the certification credential and in the people they hire. We are just beginning to see the influence of simulation technology on our industry. Just as photocopiers, e-mail, and cell phones transformed the way we conduct our daily business, simulations will transform the certification industry in the near future.