October 25, 2016

Three Best Things: Tinkering with Animations in Storyline

I like being inspired by other artists. I recently noticed this interaction concept on Dribbble that seemed like it would transfer very well into Storyline.

Originally, I intended this as an exercise in professional stealing: for practice, I stalk Dribbble artists and secretly try to recreate what they did, but in Storyline or in other software. I do it as an exercise in understanding the creative process.

So, a few moments later, I came up with this:


In this case I used an oversized parallelogram shape that was set to move down along a straight animation path at the start of the slide's timeline. Each slide used the "Push" transition, and the text used the "Fade" animation.

In the original example, my focus was on the animated yellow shape. In the original we can see the edge of the yellow shape, which moves down when the next photo slides into the screen. This is not easily achieved in Storyline, as it deals with individual slides (it's not impossible, but it's not easy). Apart from transitions, there are no out-of-the-box solutions for making slides interact with each other. In this case, however, I didn't want to go for anything overly complicated.

So, while my first version was not bad, it still lacked the grace of the original. At the same time, I knew that I wouldn't be able to recreate the same animation and still go to bed at a reasonable time. So I had to improvise and think about what's possible in Storyline. I played around with different transitions and added extra shapes on the screen to suggest that there are more screens coming up:

With a little shape on the right side.


I liked the movement, but the added shapes themselves felt weird to me, particularly on the first slide. The last slide was also a bit odd because it didn't have any. Perhaps it would have been less strange if I had more slides, but for just three it didn't really work.

So, I kept experimenting and, instead of the sideways "Push" transition, tried the up/down one. It didn't work too well at first because, contrary to the original design, my shapes had different colors.
The easy solution would have been to make all the shapes the same color, as I didn't really have any reason to make them different other than pure decoration. Still, out of curiosity, I came up with another version of this interaction. I did the following:

  • Added a custom state to each shape, changing its color to the color of the shape from the previous screen.
  • Added a trigger to change the shape's state to Normal based on the timeline (at 0.5 seconds).

Here's the result:


Vertical transitions and the color change.

While I was pleased with the result, the obvious weak point in this case is reverse navigation. Specifically, if you click the "Back" button, the shape colors will not match. Again, purely out of curiosity, I wanted to see if there was any way to work around that, apart from creating duplicate slides as I did previously.

With this in mind, I added a custom "Back" state to each shape and a trigger to change the state of each shape to "Back" when the user clicks the "Back" button. Here's what happened:

Going forward and then back with changing states.


It was an interesting outcome, but, in my opinion, too chaotic. As I mentioned before, there was no real reason to change the color of the animated shape, and in this case I could really see that it added more confusion to the whole concept. With this in mind, my final version of this interaction was this (click here for the published version):

Final version.

Simple yet still interesting, and it works quite well both forwards and backwards (if I do say so myself). In any case, it was definitely an interesting technical exercise.

If you want to play around with or use any of the versions of this interaction, you can download the complete Storyline file here. Or grab just the final version. Note that you will need the Peace Sans font, which you can download for free.

October 16, 2016

8 Ways to Make or Break a Software Training

I've spent a lot of time facilitating, observing, and receiving software training: for example, teaching someone how to use an LMS, a CMS, or some mysterious internal tool. Today I want to share this:

8-Step Plan for the Worst Software Training


  1. Sit down at the "teacher's desk" in front of the audience, projecting your screen on the tiniest projector possible.
  2. Bonus points: make sure the projector is slightly out of focus and flashes messages about the lamp needing to be changed, or other technical errors.
  3. Be very apologetic about the complex subject matter and repeatedly remind the audience that "this is a lot to take in at once".
  4. Bonus points: humorously blame it on "them", who didn't give you enough time to prepare or forced you to stick to a poor lesson plan.
  5. Start explaining every single element of the user interface one by one, going from left to right.
  6. Bonus points: explain EVERY SINGLE element/feature, even those nobody is using. Make sure to talk about an element for a while, then say: "But we're never going to use it, so don't be afraid if you can't remember it".
  7. More bonus points: show and explain even the most obvious elements of the interface. Everyone needs to hear that the "Save" button saves the document or that "File name" shows the name of the file. Throw in a knowledge check, too!
  8. Even more bonus points: cheer the students up by saying: "Don't worry, nobody can understand it from the start, it takes time", or "Nobody expects you to memorize all this, I'm sure it will all become clear with time". Let them wonder why they are listening to you now rather than getting that experience.

If this sounds familiar and you want to stop apologising for your training, do this:

8-Step Path Towards Better Software Training


1. Stop placing the blame and do your job. 

This sounds harsh, but this is where you start. Don't make excuses about the quality of the training! If you are an instructional designer, it is your job to take the raw subject matter in its primordial form and transform it to make learning possible. That's what instructional design actually is. If you are the trainer, it is up to you to make the best of the material and work with the developers to make it better. In any case, any conflicts between you and "them" have no place in the training room.

2. Stop telling learners how complex the tool is and how nobody can understand it.

This does not motivate anyone. Stand in front of the mirror and tell yourself that you will definitely fail, but it's OK, everyone does. Does that provide a sense of relief?

3. Make a list of tasks that learners should be able to do by the end of the training.

Stop thinking "tool training" and start thinking "task training". In other words, forget about "they need to know..." and focus on "they need to do...". It is highly likely that the real goal of the training is not "to know the tool", but to perform specific tasks: for example, find data, correct data, cancel orders, save documents, etc. Find out what these tasks are and write them down. Did you notice that I didn't say "learning objectives"? This is to prevent objectives like "will be able to understand the elements of the user interface".

4. Rank the task list.

Depending on your context, you may end up with a huge list of tasks the learners need to do. In this case, talk to SMEs and the actual users of the tool to find out which tasks they do more often, or which tasks are more critical. Keep in mind that SMEs' gut feelings about task frequency or importance may not be reliable. If you have access to data, use it as well. For example, in a call center you will most likely have information about contact drivers: map the most popular contacts to the tasks done in the tool.

If applicable, compare what you've learned from SMEs with the vision of the management who asked you to develop the training. Highlight any discrepancies. Make sure these are discussed and cleared before you proceed with the training design.

5. Design the activities that match the tasks. 

If you identified the tasks that should be performed, you probably have a good idea on what activities you need. The main challenge here is often a sequence of activities, particularly for complex tools.

To design a sequence of learning activities, I consider the following about the task I am teaching: frequency, ease, and the tool elements and functions used. As a hypothetical example: opening a customer's account is easy and is done very often, while investigating fraudulent activity on a hacked account is complicated and rare. Thus, opening a customer's account should be the first task/activity to learn in this case.

Keep cognitive load in mind as well. The learners should be comfortable with the basics of the tool before dealing with tasks that need analytical thinking or creative application. For example, for those who have never created an e-learning module, learning how to create simple shapes and insert pre-made buttons in Storyline should probably precede learning how to create custom buttons with your own triggers.

6. Engage your learners early on. 

Provide instructions and guidance for anything that's not obvious, but let learners figure out the more obvious information themselves. With some tools, what the learners need to know is the procedure: the sequence of actions they should perform. What they can often figure out (well, depending on your tool) is how to perform those actions in the tool. In this case, give learners the list of actions, such as:

  1. Open the customer's account
  2. Open the customer's address
  3. Change the address

and allow the learner to find the right buttons, especially if the button names are "open account", "open address", and "change address". If you're making an e-learning module, add an option to get a hint on demand. In the classroom, encourage peer support from advanced learners and focus your attention on the others.

Another option to teach a tool is to provide a step-by-step written guide (think tutorials) and ask the learners to follow the steps in the actual tool on their own. In an e-learning, consider adding on-demand videos for learners to play if they need help. Instead of only watching someone doing something, learners will actually be doing it themselves. The benefit of this approach in a classroom is that it frees you up from lecturing and lets you manage the class by walking around and helping learners as needed.

7. Include activities that learners will complete on their own.

Activities should be relevant to the learning objectives and rely on what has been learned, but should be slightly different from what has been taught in the course. These should be completed without guidance. The purpose of this is a) to practice and b) to verify the learning. These activities do not have to happen immediately. In fact, distributed practice works even better.

8. Encourage your learners.

Instead of constantly implying that everyone is an idiot (and that's what 'encouragement' like "nobody can understand it the first time" really says), highlight their successes, reassure them that they are doing great (if they are), and provide developmental feedback if they aren't.


Got any more ideas to make or break a software training? Disagree with something? Let me know in the comments or on Twitter - @GamayunTraining

October 9, 2016

Creating an Adaptive Menu in Storyline

I have always been fascinated by the idea of learning objects and adaptive learning systems. This interest is partially selfish - as a learner I rebel against training materials that force me to click through every single thing on every single slide, without letting me skip anything that I might not need, or already know.

For this week's ELH Challenge, I decided to approach the topic of checklists from my favorite perspective of adaptability. In my concept demo, I am assuming an e-learning module with three chapters. I included three questions, which represent a pre-test on the training content. Based on the results of each assessment, the module chapters are shown as "mastered", "suggested for review", or highlighted as "focus points" for learning.


Animated menu


In this post I will explain how the concept was created in Storyline 2. However, the biggest challenge of this concept (assuming you would like to flesh it out) is not the development of the list, but the instructional design, mostly because you will need to create a good assessment that will actually measure the learner's knowledge of the subject matter. Note that you will need as many sets of questions as you have chapters in your e-learning.

I will assume that you are not aiming to report the results of this pre-assessment to the LMS or otherwise track it, as that's not the intended purpose of this interaction.

Creating Assessment 

There are two ways to implement the assessment: track the scores with your own custom variables, or use Storyline's built-in question slides and results variables.

The former will give you more control, the latter will save time.

If you wish to draw questions randomly, divide them into question banks (one bank per chapter) and add random draws from the question banks to your project. It is interesting to note that you can add any slide to a question bank - you don't have to use the built-in question slides or freeform interactions.

If you're not using your own variables, you will also need to create a separate scene and add three blank results slides - one slide for each set of questions. Essentially, you're aiming for this:



It doesn't matter how your results slides look, since learners will never visit them - this is the reason they are in a separate scene. The only reason we need them is to have the quiz result variables. Specifically, we'll be interested in ResultsX.ScorePercent or ResultsX.ScorePoints, depending on whether you want to use percentages or points for the final scoring.


Creating the Menu


Fancy Checkboxes

You will notice that the self-ticking checkboxes are animated. To recreate this, first create three separate objects: a "Normal" checkbox, a "Filled" checkbox, and a Check Mark. Note that I used a regular shape rather than a button.

Create a custom state for the "Normal" checkbox - let's call it "Checked".
Add the desired animations to the "Filled" checkbox and the Check Mark. Select both of these objects, cut them (Ctrl+X), and paste them into the "Checked" state of the "Normal" checkbox. In other words, you'll have two additional objects nested inside the checkbox:
Check box state using two additional objects.
Don't forget to create three custom states to signal the chapter status, for example:

Custom states of the "Chapter status" label

I used the "Shape" animation for the filled checkbox, "Grow" for the check mark, and "Wipe" for the chapter status labels. I do recommend using entrance animations for elements that need to turn from invisible into one of several possible states (such as the chapter status labels in this example). Otherwise, they will briefly flash their "Normal" state before changing to the custom state. Adding even a tiny bit of "Fade" solves this issue. Of course, you can try to create a "Normal" state that is invisible, but consider animations, too.

Create the rest of the objects and adjust their positions on the timeline, as you wish. In my example, I have arranged everything as follows:

The elements of one chapter (title, background, status, checkbox) are grouped together and arranged on the timeline. The group is then copied to preserve the timings.

To save time, I would recommend creating the elements for the first chapter "card", adding animations, setting timeline positions, etc., and then grouping the elements and copying/pasting the group as many times as you have chapters. This way you will keep all the properties and timeline positions of all the elements within the group - no need to adjust each element over and over.

This is also how I created the "spinner" on the "Processing results" slide: I created and timed four circles, grouped them, and pasted the group several times on top of itself.

Triggers

Finally, create the triggers to change the states of the checkboxes and chapter status labels based on the assessment results. For example:

The checkbox will be set to status "Completed" if the learner scores 85% or more on the first assessment.
You're done!
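
Stepping outside the authoring tool for a moment, the trigger logic above boils down to a score-banding rule per chapter. Here is a minimal Python sketch of that rule; the 85% threshold comes from the example above, while the 60% boundary and the function name are my own illustrative assumptions:

```python
# A sketch of the Storyline trigger logic as a score-banding rule.
# The 85% threshold appears in the example above; the 60% boundary
# is an illustrative assumption.

def chapter_status(score_percent):
    """Map a chapter's pre-test score to a menu state."""
    if score_percent >= 85:
        return "mastered"              # checkbox ticked
    elif score_percent >= 60:
        return "suggested for review"  # partial knowledge
    else:
        return "focus point"           # chapter highlighted for learning

# One call per chapter, using the matching ResultsX.ScorePercent value
print([chapter_status(s) for s in (92, 70, 40)])
# → ['mastered', 'suggested for review', 'focus point']
```

In Storyline itself, each branch of this rule corresponds to one trigger with a condition on the relevant ResultsX.ScorePercent variable.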

Not Visiting Results Slides?

That's right - you don't actually have to visit the results slides to generate the quiz results. You can, if you like, create a "Loading screen" slide and add the "Submit quiz results" trigger to it (it will work for more than one assessment):



While I haven't tested this concept on an LMS, I haven't had any issues with the menu items changing states in the published output viewed online. As I mentioned before, reporting these results was not the intended purpose of this interaction, but if you do need to report them, feel free to experiment with the results slides or other workarounds, and let me know what you find out.

Good Assessment: What is "Good"?

As instructional designers, we often have to either write assessments, quizzes, and other varieties of multiple (or not so multiple) choice questions ourselves, or use SME-written ones. And often we're horrible at it. Before investing time into learning the principles behind assessment writing, I made every possible mistake, once again proving that learning without theory is not learning.

A Good Assessment is...

Proper evaluation of assessment quality, which would be necessary for a high-stakes assessment, requires specialised knowledge and skills. Still, in my daily work I have found it extremely helpful to be aware of the underlying complexity of assessment development. That is why in this post I would like to provide a simplified overview of assessment quality criteria: even a basic knowledge of these has helped me design better assessments.

So, in simple terms, an assessment is good, when it is valid and reliable.

Understanding Validity

Validity, put simply, means that your assessment measures what it is supposed to measure. There are multiple ways of measuring the validity of your test, but they ultimately depend on the purpose of your assessment, which should always be formulated clearly. For example, your assessment may claim that anyone who scores X out of Y points will be able to perform a job or a task they were taught. In this case, for your assessment to be valid, you need to confirm the correlation between the test score and the job performance. If you find that people who failed your assessment are performing just as well as those who passed, your assessment is most likely invalid.

More often, however, you will most likely want to assess the achievement of learning objectives. In this case, the validity of  your assessment can be evaluated by verifying if:

  • Questions are linked to the learning objectives. In fact, each test item should measure only one learning objective, or only one "thing". While it may seem like a good idea to create complex questions that relate to two or more learning objectives, this makes it hard to identify the reason behind failing the question: how can you know which learning objective wasn't achieved?
  • There is an acceptable proportion of questions per objective. For example, a learning objective like "will be able to save a document" can most likely be assessed with one test question, while "will be able to troubleshoot technical issues" most certainly needs more than one.
  • The cognitive level of the test items is adequate. For example, if your objective is "to analyse the impact of World War II on the global economy", a question like "In which year did World War II start?" is not valid for this learning objective. This is where Bloom's taxonomy and well-formulated learning objectives are vital.
  • The difficulty of the test items, and of the assessment as a whole, is appropriate. While defining difficulty can be quite complex, at the very least you can ask a group of SMEs, as well as someone who closely resembles the assessment's target audience, to review your assessment draft and establish the level of difficulty.


Understanding Reliability

A valid test measures what it's supposed to; a reliable test does it consistently. For a quick example of the difference between validity and reliability, take a look at this ruler:



It is very reliable (it will consistently measure the same distance), but invalid (because it actually spans 8 centimeters instead of the declared 9).

Measuring assessment reliability is usually complex and requires specific knowledge. Therefore, if you need to create a high-stakes assessment, such as a final exam or a certification program that has an impact on the learners' employment or study outcomes, I would highly recommend hiring a professionally trained assessment specialist to assist you. However, for the development of low-stakes assessments it is helpful to keep the general principles in mind.

Generally speaking, assessment reliability can be verified by:

  • Test/retest - administering the same test to the same audience after a period of time. For example, if you have ever taken the same personality test several times and got a different result each time (and there's nothing in your life that could dramatically affect your personality), then the test may not be reliable.
  • Equivalence - successive administration of parallel forms of the same test to two groups. In very simple terms, this means you will need to create two copies of the test, with questions that are different but assess the same learning objectives / skills / knowledge areas.
  • Item reliability - the degree to which test questions that address the same learning objective / skill / knowledge produce similar results.
  • Item discrimination - when evaluating test questions, it is also helpful to consider whether an item can distinguish good performers from poor ones. For example, are there test questions that high performers consistently find confusing?
  • Inter-rater reliability - if you're using human graders (for essay questions or observations), you should verify to what extent the grades for each test item vary between individual graders.
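
To make the first of these concrete, test/retest reliability is often summarised as the correlation between two administrations of the same test. Here is a minimal sketch in plain Python; the score lists are made-up illustration data:

```python
# Test/retest reliability as the Pearson correlation between two
# administrations of the same test to the same learners.
# The score lists below are made-up illustration data (percentages).

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

first_attempt  = [72, 85, 60, 90, 78, 66]  # first administration
second_attempt = [75, 83, 58, 92, 80, 70]  # same learners, two weeks later

print(round(pearson_r(first_attempt, second_attempt), 2))  # close to 1.0
```

A correlation close to 1 suggests the test ranks learners consistently; a much lower value is a warning sign (assuming nothing happened between administrations that should genuinely change the scores).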

Based on my practical experience with assessments following internal onboarding programs, the reliability of an assessment can be compromised by unreliable training processes. This is particularly true if your training program is run across multiple locations, by training teams that do not operate jointly or even belong to different legal entities within your corporation. Local issues, miscommunication between departments, lack of support for trainers: all of these factors should be considered before drawing conclusions about either the quality of the assessment or the learners' performance.

Check Your LMS

It may not always be obvious, but some learning management systems can actually help you assess your assessment. Moodle, in particular, is extremely helpful in this regard: for example, it can identify test questions that have poor discrimination or inadequate difficulty. At the very least, your LMS should provide you with the percentage of learners answering each question correctly/incorrectly (this is what is called "difficulty"). Pay attention to questions with extreme percentages (whether low or high), and particularly to 50/50 splits. These can indicate either an issue with the question, or an issue outside of the test, e.g. outdated or inadequate training.
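
To illustrate what an LMS is computing here, item difficulty and a basic upper/lower discrimination index can be derived directly from a table of learner responses. This is a simplified sketch; the response matrix is made-up illustration data, and real tools use more sophisticated statistics:

```python
# Item difficulty (proportion answering correctly) and a simple
# upper/lower discrimination index, computed from a response table.
# Rows are learners, columns are questions, 1 = correct answer.
# The matrix is made-up illustration data.

responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
]

def difficulty(item):
    """Proportion of learners who answered this item correctly."""
    col = [row[item] for row in responses]
    return sum(col) / len(col)

def discrimination(item, fraction=0.5):
    """Difficulty among top scorers minus difficulty among bottom scorers."""
    ranked = sorted(responses, key=sum, reverse=True)
    k = max(1, int(len(ranked) * fraction))
    top = sum(row[item] for row in ranked[:k]) / k
    bottom = sum(row[item] for row in ranked[-k:]) / k
    return top - bottom

for q in range(len(responses[0])):
    print(f"Q{q + 1}: difficulty={difficulty(q):.2f}, "
          f"discrimination={discrimination(q):+.2f}")
```

A difficulty near 0 or 1 means almost everyone failed or passed the item, and a discrimination near zero (or negative) means the item does not separate strong performers from weak ones.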

The Worst Practice to Avoid

Creating an assessment that has 20 questions and a passing grade of 80% for no reason other than this being the so-called "best practice". Plainly speaking, neither the number of test items nor the passing grade should be set arbitrarily, as this compromises the validity of the assessment. In fact, an assessment based solely on these two numbers (or either one of them) usually leads to disputes and confusion, as there is no clarity about what the numbers mean or what the purpose of the assessment is. There are specific principles that can help you make reasonable decisions about the number of test items, as well as the passing grade, and these should be applied to all assessments.

In general, a poorly written assessment is at the very least a waste of time, both yours as an assessment writer and the learner's (and at worst, highly unethical or even illegal). In such cases, no assessment is better than a bad assessment!

Further Reading

In addition to this extensive summary of the views on validity and reliability, and this (less academic) overview, I found the following printed resources helpful, practical, and accessible to a more general audience:


  • Anderson, P., Morgan, G. (2008) ‘Developing Tests and Questionnaires for a National Assessment of Educational Achievement’, vol. 2, World Bank Publications.
  • Burton et al. (1991) ‘How to Prepare Better Multiple-Choice Test Items: Guidelines for University Faculty’, Brigham Young University.
  • Downing, S., Haladyna, T. (1997) ‘Test Item Development: Validity Evidence From Quality Assurance Procedures’, Applied Measurement in Education, vol. 10, nr. 1, pp. 61-82.

Most of them can be found through Google Scholar.

August 14, 2016

ELH Challenge 140: Slide Transitions and Back Navigation

For this week's ELH Challenge (the ADDIE model) I wanted to make something short. Highly interactive branching scenarios were therefore out of scope, as was anything that would require custom writing (as you can see from the frequency of my blogging, I'm not exactly a fan of writing in my spare time). So I settled on this demo, which describes the ADDIE phases and touches upon their possible combinations and order.

Since my submission was relatively straightforward (no variables here!), in this post I would like to talk about an aspect that is probably very easily missed: back navigation, i.e. what happens when the learner wants to go back. In the context of this challenge, I will focus specifically on slide transitions and the impact of back navigation on my design choices.

While most of the time back navigation is straightforward, in certain cases, particularly if you're tailoring animations and slide transitions to make your course look good as the learner progresses towards the end, overlooking back navigation can be detrimental to the experience you designed. For instance, let's review the slide transitions in my submission:

Final version of navigation + transitions

As you will notice, when we move from the title slide ("ADDIE") to the "Analyze" slide, we zoom into it. The slides for the other phases (Design, Develop, etc.) use the "Cover: From Left" transition instead. My logic behind this choice was to create the feeling that we are "zooming in" (= looking closer) at the ADDIE abbreviation and then moving from letter to letter to explore their meaning. This is why the first slide uses "zooming", while the other slides use "Cover" (more on "why Cover and not something else?" later).

In the example above you will also notice that if we go to the "Design" slide and then navigate back, the "Analyze" slide will use a different transition. Since I was trying to create a feeling of moving between letters, I wanted to avoid this:

Immersion-breaking transitions for back navigation


This is not disastrous, of course, but it does contradict the idea I am trying to communicate and is quite jarring.

To ensure smooth back navigation, I added a duplicated "Analyze" slide and adjusted the navigation tied to the player buttons:

Better user experience but more maintenance


As you can see, the only way to get to slide 1.2 is from the title screen. All other navigation paths include slide 1.3 (which is an exact duplicate of slide 1.2, but with a different transition animation).

This might not be the best solution for courses where you need to update the content very often (as you will need to make corrections in both copies of the slide), but for stable content it can be a good workaround that doesn't require too many triggers, conditions, or animation paths.

As mentioned before, I should explain the reasoning behind the "Cover" transition for the other slides. Usually, to create an illusion of movement or uninterrupted flow between slides, I would recommend the "Push" transition, particularly if you're not allowing back navigation. However, look at what happens if I use the "Push" transition in this case (remember, we want to feel that we're moving from left to right across the "ADDIE" abbreviation):


Confusing transitions for back navigation

As you can see, this presents two issues. Firstly, the off-slide parts of the letters are cut off, creating a rather ugly transition effect. Of course, this could be fixed with a different design, for example by placing the letters so that they do not bleed off the left or right edge of the screen. However, there is a second issue. When you click "Next", the slides move to the right, which is also the direction that the "Next" button indicates. When you click "Previous", you subconsciously expect the previous slide to enter the frame from the left, pushing the currently displayed slide to the right. However, this doesn't happen. Instead, when you click "Previous", you see the same animation as if you were navigating forward. I found this highly confusing, and if I'm confused by my own module, that's usually a sign that I need a different design solution!

With this in mind, after trying different combinations of transitions, I settled on "Cover", as it seemed the least confusing of all. Of course, it may be possible to create an amazing transition with the help of animation paths and other effects, but that will be my exploration opportunity for the next challenge.

Do you know a better way, or have you seen examples with a great back-navigation experience? Drop me a line in the comments (or on Twitter - @GamayunTraining).

July 26, 2016

Confessions of the Portfolio Newbie

ELH Challenge 138 asked us to answer questions about our portfolios. As I now technically have one, and I've also had some experience reviewing the work of others, I decided to jump in and record my answers as well. I hope these will be useful or inspirational for those who are just starting, or just thinking about creating, a portfolio.

Below you will find short summaries of my answers - you can hear the extended versions by playing the embedded audio tracks.


Tell us a little about yourself. How did you get started in e-learning? How long have you been in the industry?


I graduated with an MA in Teaching English as a Foreign Language, worked in the Learning & Development area, and gradually grew from an instructional designer + trainer into an instructional designer + e-learning developer. In addition, I now have my second MA, in Online and Distance Learning.


Tell us about your e-learning portfolio. What types of projects do you include in your portfolio? How often do you update your portfolio?



I started creating a portfolio in April this year, and I use ELH Challenges to fill it. I update it almost weekly, or whenever I have another challenge done. At the moment I'm creating a "critical mass" of content, and I will take a more critical look at it once I have a clearer goal for my portfolio. I decided to make it a work in progress rather than wait for the moment when it becomes perfect.



What do you think makes a good online portfolio? What should and shouldn’t be included in an e-learning portfolio?



I think of portfolios as resumes, but with fewer words. For me, a portfolio represents me and the work that I would want to do, so that's the answer to the question of what should be there. As to what shouldn't: my pet peeve is plagiarism (seems obvious, but sadly, it happens), as well as examples that do not support your image as an instructional designer.

On the subject of plagiarism, I've created ELH entries based on the work of others, for example, my "Catman GO" and App-Style Navigation examples. However, I reworked the originals and clearly credited my sources.


What do you think clients or companies look for in an e-learning portfolio?



In short - whether the works presented in the portfolio match their needs. Some examples from my past are:
  • whether an instructional designer can match the style of the courses that already exist in the company
  • and/or whether they can follow branding guidelines, so that their work doesn't clash with the existing course base
  • whether the portfolio items show instructional design and pedagogical thought


What platform or technology did you use to build your e-learning portfolio?



Since I'm still at the beginning of my Project Portfolio, I'm simply using my Articulate Profile as a front page. I host my modules on AWS and occasionally write in this blog.


What’s the most challenging part about building, designing, or maintaining portfolios?



Starting, by which I mean finding time and fighting perfectionism. It was worth it, though, as I have my portfolio to thank for my new job.


How do you handle confidentiality issues with projects in your portfolio?



I fill my portfolio with ELH Challenges. Two of my entries are based on actual trainings (Elevator Speech and Course Personalisation), but these have been reworked beyond recognition, based on what I remember about the projects, and filled with my own content.


What are your top three tips for users looking to build their first e-learning portfolio?



Do it. Particularly if you're bad at self-marketing, a portfolio will showcase your achievements and skills better than you can.

If you feel like you don't have the time and energy - gather solid data (not hunches) on where your time goes. Then try reducing the amount of things you hate doing and/or things that do not benefit you directly or are not in line with your goals and values. Get "a room of your own" in the sense of free time, and use it to build something that will show the world who you are.

Get good feedback. Ideally, get a mentor or reach out to someone whose opinion you value. Talk to me on LinkedIn, if you like.




July 21, 2016

How to Create a Stealth Action Puzzle Game in Storyline 2

In this post, I want to describe the design and development process behind my submission for ELH Challenge 137 - a remake of the computer game Hitman Go (if you haven't played it, consider watching this gameplay video on YouTube). I'm going to outline some prehistory of the project, describe my design process and what was going on in my head, explain what I spent time on and the reasoning behind individual elements of the submission, and walk through the actual development of the module in Storyline 2.

The 3rd level of Catman Go (working title)

Background


I started experimenting with this idea a while ago, aiming not to think about the visual design and focus solely on the technical functionality. I also wanted to try and make a board that would scale or scroll (like in this great example by Rais Omar) and the game piece that could be dragged. If you're curious, click here for a published slide from my original experiment.

Although I learned a lot and was happy that I did this experiment, it turned out to be too much work for a "stupid side project", particularly when it came to designing an elegant solution for preventing the player from dragging the piece straight to the exit. Thus, I shelved it and made a mental note to come back to it sometime later. Thankfully, the ELH Challenge helped me revisit this project and implement the things I had learned in its new version.

For the Challenge, since I wanted to have a finished product without too many shortcuts, I decided to narrow the scope of my creativity and cut down the number of features. Specifically, I got rid of dragging and dropping and abandoned the idea of a scrollable board.

Design 

I started my design process with research. Essentially, I did a lot of image searching on Google, Dribbble and stock image websites to saturate myself with images of game boards, and screen-clipped anything I liked into OneNote.

With this done, I started designing the layout of the largest level in Inkscape (a free, open-source alternative to Adobe Illustrator). Since I had abandoned the idea of a scrollable board, I needed to know how much real estate would be occupied by the essential elements: fields, pieces and goal tracking.

Comparing the two levels presented an interesting problem: too much real estate on the first level and just the right amount on the last. Therefore, I needed something to fill the void on the first level without breaking the unity of the overall design. After playing around with various wild ideas, I settled on a design that reused the field grid from the last level, with the extra fields clearly unavailable. I achieved the effect by making those fields semi-transparent and removing the gray connectors between them.

As a side note - while the need for fields and pieces is quite obvious, you might wonder why goal tracking was important in such a simple game. Well, when playing Hitman Go, it took me approximately 10 or more levels to figure out where I could find the level goals... Since I consider myself a person of reasonable cognitive ability, and I only had time to create three levels at most, I needed to make sure my users didn't miss the goals. In addition, as you probably know, it's good instructional design practice to keep the task visible or within easy reach of the learner.

With all the elements created and arranged into the final level layout, I used Palettable to create a color scheme matching the mood of the game, which I then applied to my design. Well, technically, I came up with four color schemes and asked my family and friends to select the ones they liked.

The logic behind the final coloring was this:
  • Movement (beige) - the fields themselves and circles along the path
  • Direction (gray) - the connectors between fields
  • Focus points (black) - the cat itself and the arrows indicating the lines of sight
  • Goals (white) - mice and goal text
  • Background (blue)
Surprisingly, I spent quite a lot of time choosing fonts for an interaction that has a minimum amount of text. Below is the original font line-up I was choosing from (as well as an example of an alternative color scheme):

Early font candidates

I discarded all the fonts that looked too straightforward/elegant/boring for the context of a game fit for a kid (such as Raleway). Then I discarded the fonts that looked hand-written (e.g. Apple Boy) or were too elaborate, since, in my opinion, they did not agree well with the precise, geometric design of the game board. The final candidates were Perfograma and Showcard Gothic. As you can see, Perfograma looked cool but still too sophisticated for a game like this - not fun at all:

I think it wants to spell "No fun allowed"

In comparison, Showcard Gothic was definitely winning the battle, particularly when paired with another font (in this case - GrilledCheese) and a little sad face:

Nothing says "fun" as a sad face, right?

Development 

To avoid recreating all of the elements in Storyline, I exported them from Inkscape as .png files instead. While a time-saver for development, this is definitely a time-sink for any future changes. For example, if you use the same image for 20 objects, Storyline does not let you replace that image with one click (unlike Captivate or Construct 2, which store images as global assets rather than individual objects), so you have to update every single object (unless you want to mess around with the published output to replace the images there, or resort to other hacks). Do this only if your art is too complex to be created in Storyline, or if you're really sure that nothing is going to change in terms of colors or shapes.

Since I had designed all my levels in Inkscape, I knew exactly what the end project would look like, so I started with the biggest level and worked backwards from it. This way, instead of adding more triggers, I had to delete unneeded objects and then delete any triggers with "Unassigned" values in them. The rest of the triggers I simply adjusted to match the demands of each level.

Let's take a look at how a level is built.

Objects


Each field is an object with three states, as you can see below:


By default, all fields are set to "Disabled", except the fields into which the player can move. This ensures that the player doesn't just click the "Exit" field from the start.

Every field has a number assigned to it, reflected in its name on the timeline. To keep my sanity and make development and testing faster, I added a separate layer with the field numbers to serve as an overlay:


I kept the numbers consistent throughout the module, which helped me greatly in reusing the triggers. The only exception is the Exit field, which is always 99, as it has a different set of triggers.

The mice are separate objects. I thought about adding them to button states, but decided that this would complicate an already complicated system and added them on top of playing fields instead. The mice have the same numbers as the fields they are standing on.

Variables and Triggers

Technically, the game uses only two types of variables: one variable to track the currently active field and a true/false variable to track the state of each mouse. Here's the list of variables actually used in the module:



The variable "FieldSelected" changes when the player clicks on a field to move the cat. For example, if a player clicks on Field 12, the variable value is set to 12. The same trigger was added to the mice as well, as otherwise the player would have to click the areas around them. For mice, however, you need to make sure to add a limiting condition to the trigger, as otherwise players could click on each mouse whenever they want:

Mouse can be clicked only if the field it's on is enabled / active.

Alternatively, you can set the mice to the "Disabled" state and create triggers to enable them based on the value of the "FieldSelected" variable. This way you'd have to make more triggers, but on the upside, the player won't see the "hand" cursor when hovering over the mice.
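Storyline triggers are built visually rather than written as code, but the click logic above can be modeled in a few lines of Python. This is only an illustrative sketch; the field numbers, states and function names are my own examples, not part of the module:

```python
# Rough Python model of the click triggers described above.
# Field numbers and their starting states are hypothetical examples;
# 99 is the Exit field, disabled at the start as in the module.
field_states = {11: "Disabled", 12: "Normal", 13: "Normal", 99: "Disabled"}
field_selected = 12  # the "FieldSelected" variable: the cat's current field

def click_field(number):
    """Clicking an enabled field moves the cat there."""
    global field_selected
    if field_states[number] != "Disabled":
        field_selected = number

def click_mouse(on_field):
    """A mouse click mirrors the field click, but needs the limiting
    condition: the field the mouse stands on must be active."""
    if field_states[on_field] != "Disabled":
        click_field(on_field)

click_field(13)  # allowed: Field 13 is "Normal", so FieldSelected becomes 13
click_field(99)  # ignored: the Exit field is still "Disabled"
```

The key point mirrored here is that the mouse's trigger carries the same state check as the field's, so a disabled field can never be entered through the mouse standing on it.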

In order to enable/disable all buttons correctly, you need to write two slide triggers for each button:
  • Change the field state to "Normal" if the value of the "FieldSelected" variable equals the value of a nearby field.
  • Change the field state to "Disabled" if the value of "FieldSelected" does NOT equal the value of any nearby field. Note that in this case, if you need several conditions, the logic should be AND, not OR.
For example, in the situation below, Field 13 would only be enabled if "FieldSelected" equals 12, as Field 12 is the only field from which a player can reach field 13:

Playing fields

Consequently, if "FieldSelected" does not equal 12, Field 13 should be disabled.
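The pair of triggers for a single field boils down to a simple adjacency check. Here is a Python sketch of that logic, with a hypothetical adjacency map (in the real module this lives in two Storyline triggers per field, not in code):

```python
# Sketch of the two per-field state triggers. NEIGHBOURS maps each field
# to the fields a player can reach it from (hypothetical data: Field 13
# is reachable only from Field 12).
NEIGHBOURS = {13: [12]}

def field_state(field, field_selected):
    # Trigger 1: "Normal" when FieldSelected equals any nearby field
    # (multiple "equals" conditions joined with OR).
    if field_selected in NEIGHBOURS[field]:
        return "Normal"
    # Trigger 2: "Disabled" when FieldSelected equals none of the nearby
    # fields (each "does not equal" condition joined with AND).
    return "Disabled"

field_state(13, 12)  # "Normal": the cat stands on Field 12
field_state(13, 11)  # "Disabled": Field 13 can't be reached from Field 11
```

Note how the AND/OR distinction falls out naturally: "reachable from at least one neighbour" is an OR, while "unreachable from every neighbour" is an AND over the negated conditions.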

For each mouse, you will need to create three triggers, following this example:

Mouse triggers

Essentially, you need a trigger to interrupt the gameplay if the mouse is not dead when the player moves to a field it's observing (in this case, Field 11). Secondly, you need a trigger to "eat" the mouse (set the variable "MouseDead" to true) if the player sneaks up on it. Thirdly, you need a trigger to hide the mouse once it's been "eaten" (when "MouseDead" changes to true).

In other words, you'll need two triggers per field to manage its states, one trigger per field to adjust the "FieldSelected" variable, and three triggers per mouse to interrupt the gameplay or manage the state of the mouse.

Final Touches

I was listening to the Cat Purr Generator while working on this example and wanted to add some of that ambiance to the game. I didn't want to spend a lot of time looking for matching royalty-free soundtracks, so I decided to limit the audio to occasional "feedback" sounds: a meow when the cat eats a mouse, a hiss in case of failure, and a content purr at the end of the level.

Lessons Learned

  • Set a clear and manageable scope for your project.
  • Design for both the largest and the smallest number of elements on the screen.
  • Start development with a very clear goal in mind.
  • Work from hardest to easiest (may not always be true, but applies in this case).