
L3 Lead Examiner

Report 1906
June 2019
L3 Qualification in Engineering
Unit 6: Microcontroller Systems for Engineers

31725H
Edexcel and BTEC Qualifications

Edexcel and BTEC qualifications come from Pearson, the world’s leading learning company. We
provide a wide range of qualifications including academic, vocational, occupational and specific
programmes for employers. For further information visit our qualifications website at
http://qualifications.pearson.com/en/home.html for our BTEC qualifications.

Alternatively, you can get in touch with us using the details on our contact us page at
http://qualifications.pearson.com/en/contact-us.html

If you have any subject specific questions about this specification that require the help of a
subject specialist, you can speak directly to the subject team at Pearson. Their contact details
can be found on this link: http://qualifications.pearson.com/en/support/support-for-you/teachers.html

You can also use our online Ask the Expert service at https://www.edexcelonline.com. You will
need an Edexcel Online username and password to access this service.

Pearson: helping people progress, everywhere

Our aim is to help everyone progress in their lives through education. We believe in
every kind of learning, for all kinds of people, wherever they are in the world. We’ve
been involved in education for over 150 years, and by working across 70 countries, in
100 languages, we have built an international reputation for our commitment to high
standards and raising achievement through innovation in education. Find out more
about how we can help you and your learners at: www.pearson.com/uk

June 2019
Publications Code 31725H_1906_ER
All the material in this publication is copyright
© Pearson Education Ltd 2019
Grade Boundaries

What is a grade boundary?

A grade boundary is where we set the level of achievement required to obtain a certain grade
for the externally assessed unit. We set grade boundaries for each grade, at Distinction, Merit
and Pass.

Setting grade boundaries

When we set grade boundaries, we look at the performance of every learner who took the
external assessment. When we can see the full picture of performance, our experts are then
able to decide where best to place the grade boundaries – this means that they decide what the
lowest possible mark is for a particular grade.

When our experts set the grade boundaries, they make sure that learners receive grades which
reflect their ability. Awarding grade boundaries ensures that learners achieve the grade they
deserve, irrespective of variation in the external assessment.

Variations in external assessments

Each external assessment we set asks different questions and may assess different parts of the
unit content outlined in the specification. It would be unfair to learners if we set the same
grade boundaries for each assessment, because then it would not take accessibility into
account.

Grade boundaries for this, and all other papers, are on the website via this link:

http://qualifications.pearson.com/en/support/support-topics/results-certification/grade-boundaries.html

Unit 6: Microcontroller Systems for Engineers

Level 3

Grade           Unclassified   N    P    M    D
Boundary mark   0              11   22   37   53
Introduction
Unit 6 (Microcontroller Systems for Engineers) is a mandatory unit that requires learners to
complete a set task to control a system using a microcontroller. There are six activities to
complete for the whole task. For this series, learners were required to produce a queue
monitoring system for a ride at a theme park.

The external assessment task is structured to address the assessment outcomes for the unit.
The assessment outcomes are:

AO1 Demonstrate knowledge and understanding of computer coding principles, electronic
hardware components and the development process

AO2 Apply knowledge and understanding of computer coding principles, electronic hardware
components and of the development process to design and create a physical computer system
to meet a client brief

AO3 Analyse test results and evaluate evidence to optimise the performance of a physical
computer system throughout the development process

AO4 Be able to develop a physical computer system to meet a client brief with appropriate
justification

There is a marking grid for each of the six activities that make up the whole task. Examiners
allocate marks to the assessment evidence provided by the learners, for each of the six
activities, using a holistic ‘best-fit’ approach. They compare the evidence for each activity to the
corresponding marking grid and the bands/descriptor bullet points within.

Please note that all of the examples of learner assessment evidence provided in this report are
extracts. As a result, they can only be considered to be representative of evidence that would
be awarded a mark from a certain band. In reality, all of the assessment evidence for a given
activity (which is generally quite extensive) must be considered when awarding a mark for that
activity.

Learners are required to submit the Part S task booklet for marking. This was in the form of an
electronic document saved as a PDF. In addition, learners submitted an audio/visual recording
showing the system in operation. A time limit of three minutes is set for the audio/visual
evidence.

Learners should ensure that the assessment evidence is placed in the correct activity section of
the booklet, as evidence for each activity will only be considered if it is within the relevant section.
Introduction to the Overall Performance of the Unit

The task for 1906 was in the main a sensing system. Learners had to create a system that
would monitor inputs, count any changes up or down, and translate this into a waiting time.
This in itself is not too difficult. However, the monitoring and counting system had to be
continuous, i.e. it had to keep operating while the ride was running. This part was a timing task,
which learners could have experienced if they had attempted the previous series task and the
egg timer sample assessment material as practice tasks. The additional sample assessment
material, available online, was also relevant to the task as it also sensed and counted items for
an injection moulding system.

The continuous monitoring, although stated in the steps of the queue and ride process (shown
and listed in the set task brief), may not have been interpreted well by all learners. As a result,
some common responses were seen.

These were:

1. learners producing a counting system, then a timing system, as a solution, which met
parts of the brief
2. learners producing a counting system with a timing system running at the same time,
which met most, or all, of the brief depending on the other features learners
demonstrated.

It is important that learners study the brief in detail to ensure that they fully understand
the client needs.

For the task, learners had to design and test a prototype monitoring system to count the
number of people in a queue as they went through a turnstile for a ride at a theme park. The
number of people in the queue would determine the wait time before they could go through
another turnstile into the ride area. The system had to count how many people were in the
queue and then indicate a waiting time. There should also be indicators to show whether the
second turnstile is open or closed.
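By way of illustration only, the core logic of the brief can be sketched in a few lines of platform-agnostic Python. The wait time per person, function names and values below are assumptions made for the sketch, not figures taken from the set task brief.

# Minimal sketch of the counting and wait-time logic described in the brief.
# All names and values are illustrative assumptions.

WAIT_PER_PERSON = 30     # assumed seconds of waiting added per person queuing

queue_count = 0
turnstile2_open = True

def update_outputs():
    """Drive the display text and indicators from the current count."""
    wait_time = queue_count * WAIT_PER_PERSON
    print(f"People queuing: {queue_count}  Estimated wait: {wait_time}s")
    print("Turnstile 2:", "OPEN" if turnstile2_open else "CLOSED")

def person_enters_turnstile1():
    """Called each time the entry turnstile registers a person."""
    global queue_count
    queue_count += 1
    update_outputs()

def person_enters_turnstile2():
    """Called each time a person passes from the queue into the ride area."""
    global queue_count
    if turnstile2_open and queue_count > 0:
        queue_count -= 1
        update_outputs()

# Example: three people join the queue, then one passes through to the ride.
for _ in range(3):
    person_enters_turnstile1()
person_enters_turnstile2()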

The majority of learners completed the tasks well. The examiners were able to award a full
range of marks for each of the activities and across the task as a whole. Most learners
followed the instructions and saved the electronic workbook and audio/visual file as
required. However, some of the electronic files were not saved as PDFs, and in some cases
text boxes were not visible and had to be moved into the page view to see the evidence. It
was also seen that some learners included screenshot images taken from the sample marked
learner work and presented these as their own. This is malpractice and plagiarism; where it
was seen, the marks awarded reflected this and centres will be contacted to make them aware
of the malpractice. Centres are reminded that the sample marked learner work is for
assessment guidance and should not be issued to learners as a resource.
In presenting evidence for Activity 6, some centres exceeded the three minute limit for the
audio/visual recording and others provided as many as seven audio/visual recordings rather
than a single recording. Where this was evident, learners tended to describe the program in
detail and spent a lot of time doing so, leaving insufficient time to describe the system in
operation.

Centres are reminded to submit the electronic workbook in PDF format, to limit the
audio/visual recording to a single file of no longer than three minutes, and that
prompting or questioning of the learners during the recording is not allowed.

The written content provided by learners varied, with some clear and well-organised
responses that used subtitles to structure some activities. For example, in Activity 2
some learners clearly referenced the requirements of the brief and interpreted these in
order to produce a detailed specification.

In Activity 3, learners identified the input and output devices but often overlooked the link
to the microcontroller. There were some clear justifications for the use of some of the
input and output devices, with appropriate sub-headings for these.

The program development varied in quality across the learners. However, most provided
a program that worked and annotated this to explain how it worked. Centres are
reminded that programming languages such as Scratch and Picaxe Blockly are not
acceptable for this unit.

Audio-visual recordings of the systems in operation showed a generally successful
outcome. As in the previous series, there was evidence of learners building simple jigs to
demonstrate the system. These were useful in demonstration but could consume valuable time
that could be spent on other parts of the task. There are no marks awarded for this type
of feature, so learners should be made aware of this and not spend too much time
building these. Similarly, some centres provided a rig for the learners. This could have a
negative impact on the learner, as the option to choose input and output devices may not
be available if the rig is purpose-built with solenoids, switches, LEDs etc.

In most of the learner work, suitable responses were seen across all activities, with an
improvement in Activity 1 compared to the previous series. In addition, those learners
that were not able to demonstrate the system in operation (Activity 6) were still able to
provide some description of how it was intended to work and managed to pick up marks
across the other activities. This is an improvement on the last series, where a non-
functioning system tended to reflect poor performance across the other activities.
Learners’ responses to all of the activities that make up the whole task are considered in
the next parts of this report.
Activity 1 – Task Planning and system design changes

This activity is designed to test the learner’s ability to plan in advance of each session and to
review/justify the changes made during Activities 2 to 6, in order to fulfil the requirements of
the Part S Client brief. The assessment focus is to ‘Carry out an iterative development process’.

The record sheet should be completed during each session, describing what was done,
highlighting any difficulties/problems and how these were overcome. Particular emphasis is
placed on justifying the solution to any problems here. At the end of each session, the learner
should identify the tasks for the next session and plan the order/priority of these. Some
learners copied entire activities into the log book. This is not required as the record is an
individual activity. Learners can, however, copy extracts from other activities, for example,
program development to show error codes and how they overcame these.

Many learners (including those of a higher ability) seemed to interpret this activity as simply
requiring a generic time plan and diary/reflective log, which mainly resulted in marks from
Band 1. Some of these tended to copy large sections of the client brief and link these to the
activities. For example:
To gain higher marks, learners should (please refer to the Activity 1 marking grid):

• Provide a more detailed outline time plan that refers to the problem (a queue
monitoring system for a theme park ride in this case).

• Describe in detail what they did each session. This needs to be more detailed than 'I
completed Activity 2 and then started Activity 3'.

• Show clear planning with prioritised tasks for the next activity.

Extract 1

Extract 1 shows a learner record for Activity 4.

The learner has gone beyond a simple record of what was done here. Each stage of the session
has been identified. There is evidence of context in developing the program so that it
addresses the brief. The learner is also thinking ahead and testing parts of the program as it
develops and is aware that screen shots will be useful when the program is annotated.
Extract 2

Extract 2 is an example of the issues and solutions for the session. There is evidence of what
issues were met, such as the problem with the buzzer, and how this was overcome, with
justifications supporting the solution/fix at this stage. The learner has shown, through trial
and improvement during the development stages of the program, that the process is iterative:
the higher sound value mentioned and the 500 ms time delay solved the issues identified.

Extract 3

The action points for the next session are well planned in Extract 3 (above). The learner has
identified the main tasks, in this case related to Activity 4.

The learner has previously worked on turnstile 1 and is now developing the program for
turnstile 2. This has been extended to making the ride start automatically and having the
emergency stop facility. The learner is considering reviewing the program to make it more
efficient, using macros to tidy up some parts. The learner is also planning to test the system
virtually before it is downloaded to the hardware for system tests.

The type of response shown in these three extracts would be representative of Band 3
evidence.
Most learners produced evidence in the intended way, completing the logs as they progressed
and maintaining the records in Activity 1. A small number of learners copied the log into each
activity making it difficult to mark as one task. For future series, learners should be reminded to
keep the logs together within Activity 1.

Activity 2 – Analysis of the Brief

In this activity, learners need to interpret the client brief into operational requirements and
prepare a technical specification for a user-friendly system that can handle some unexpected
events. Learners also need to consider how the system will be checked, so they need to prepare
a test plan to check the functionality of the final solution against the technical specification and
include some unexpected events.

In both stages of this, unexpected events are important. Learners that do not consider these
will restrict the marks that they can achieve to Band 2 at best.

Some learners simply repeated the client brief as the technical specification, and the test plan
was generic, with simple tests that would only confirm that an output device, for example an
LED, worked. Quite often the test conditions were vague here, or simply stated that the
program would test it.
Extract from an initial test plan

To achieve high marks in this activity, the client brief should be interpreted well and a detailed
specification produced. The test plan should cover a range of tests that will be used during
development to confirm the system is performing as intended. The specification should
consider an enhanced user experience, and the test plan should cover unexpected events in
order to achieve Band 3 marks.
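Although learners test the physical prototype against a written test plan, the expected-versus-actual comparison involved can be illustrated in software. The sketch below is illustrative only; the waiting-time rule and all values are assumptions, not figures from the brief.

# Illustrative software checks mirroring the structure of a test plan:
# a defined input, an expected result and a check against the actual result.
# The 30 s per person rule is an assumption made for this sketch.

def wait_time(queue_count, wait_per_person=30):
    """Assumed waiting-time rule: 30 s for each person currently queuing."""
    return max(queue_count, 0) * wait_per_person

# Expected behaviour: each person joining the queue adds one step of wait time.
assert wait_time(0) == 0
assert wait_time(1) == 30
assert wait_time(3) == 90

# Unexpected events: a person leaves the queue, so the wait time must fall,
# and the count must never produce a negative wait.
assert wait_time(2) < wait_time(3)
assert wait_time(-1) == 0

print("All illustrative checks passed")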

Extract 1
Extract 1 shows a good understanding of the problem. The learner has started off by repeating
some of the parts of the brief, but has then expanded these to show a good interpretation of the
brief and to consider the operational requirements for the system. The brief is met in full and
there is evidence of an enhanced user experience, for example showing the waiting time on an
LCD display and considering time for the riders to enter and leave the ride. The learner has also
considered the need for the system to constantly monitor people entering the queuing area, as
well as unexpected events such as too many people in the ride area and someone leaving the
queue due to impatience.

Extract 2

The test plan in Extract 2 shows a range of tests on the system. For the first three, the learner
has identified tests that are in context, referring to the turnstiles and queue time. The test
will use a button press to indicate a person going through the turnstile, and the
expectation is that the wait time shown on the LCD will be based on the number of people in
the queuing area.

There is evidence of the learner considering an unexpected event, with tests to check the
system can handle this. The pause function is a requirement of the brief and is considered
here, with an enhanced information feature using the LCD display. The final test is one which
was overlooked by many learners: the system would be unsuitable if it had to be restarted or
reset after each ride, so it needs to monitor constantly without the need for a manual restart
of the whole process.
Activity 3 – System Design

In this activity, learners need to provide evidence that shows they are considering the input and
output devices for the system. To achieve high marks, this should go beyond simple
identification or lists, which will restrict marks to the lower bands. The input and output devices
should be appropriate for the operational requirements, with justifications for the selection
reflecting high-band marks. Learners should describe the function of the input and output
devices. This should also include the microcontroller connections. Many learners overlooked
this and simply attached a screenshot of a microcontroller without any useful explanation of
its use. For some learners there was only a brief indication of the connections, such as 'the
switch is connected to port B0'. This resulted in marks from Band 1. For example:
This is the layout I am using for this system. 2 push to make
buttons going into D0 and D1 then one normal switch into D2. I
then have an LCD in Q0 and 1 and then a speaker in Q2

Throughout this activity, learners should use technical terminology and industry-standard
conventions within the descriptions. Finally, and often overlooked, there should be evidence of
the program design. At this stage it should be an outline of the program structure; it should
not be the developed program, as that is the Activity 4 task.

The form of the program structure could be a flowchart or pseudocode with some annotation
for the key features. The extract above has no evidence that the program design has been
considered for the activity.
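As an indication of the level of detail expected at this stage, an outline of the program structure might look like the sketch below. The step names and events are illustrative assumptions; in a learner's work each step would become a flowchart branch or a few lines of annotated pseudocode.

# Outline of the program structure only, not the developed program.
# Step names and events are illustrative assumptions for this sketch.

PROGRAM_OUTLINE = [
    "initialise the count, indicators and LCD",
    "LOOP:",
    "  if turnstile 1 is triggered, increment the queue count",
    "  if the queue-leave input is triggered, decrement the count (unexpected event)",
    "  update the LCD with the count and the estimated wait time",
    "  if turnstile 2 is unlocked and triggered, decrement the count",
    "  when the ride area is full, lock turnstile 2 and start the ride timer",
    "  keep monitoring turnstile 1 while the ride timer runs",
    "  when the ride timer ends, unlock turnstile 2 and repeat the LOOP",
]

for step in PROGRAM_OUTLINE:
    print(step)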

Extract 1
Extract 1 shows part of a list of input and output devices from a learner. These link to the user
requirements; for example, for the push-to-make switch the learner describes what it will do and
its role in the system. Following the list of components, the learner has justified why they were
selected.

Extract 2

Extract 2, above, shows part of the microcontroller connections from a learner's work. This
shows the connections in context, with reference to the waiting time, and shows the output to
green, yellow and red LEDs and the input via a toggle switch.
Extract 3

Extract 3 shows a learner producing a program design using pseudocode. The learner has
thought about the key stages required and has produced some annotation to support the
code. It is a well-designed layout and the stages are logical. This gives the learner a good
starting point for the next activity. There is an unexpected event (people leaving the queuing
area) that has been considered by this learner, which is required to access the higher marks for
the activity.
Activity 4 – System assembly and programming

This activity is the main development task undertaken by the learners. The program should be
developed from the outline in the previous activity to produce a solution that meets the needs
of the user. Learners mainly used flowcharts, although programming languages were also seen.
Any changes or developments here show that the learner is using an iterative approach to the
problem.

The programs often overlooked the requirement that the system had to monitor people joining
the queue at any time. This meant that Turnstile 1 should be able to count whilst the ride is in
motion.
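In program terms, the common pitfall was running the ride timer as a single blocking delay, during which no inputs are read. A minimal sketch of the non-blocking alternative is shown below; it is platform-agnostic Python, and the timing values, input function and poll interval are assumptions made for illustration.

import time

RIDE_TIME = 3           # shortened for the sketch; the learner example later uses 30 s
queue_count = 0

def turnstile1_pressed():
    """Placeholder for reading the real turnstile input; always False here."""
    return False

# Blocking approach (often seen, but it loses counts while the ride runs):
#     time.sleep(RIDE_TIME)    # nothing is read for the whole ride time
#
# Non-blocking approach: keep polling turnstile 1 while the ride timer runs.
ride_end = time.monotonic() + RIDE_TIME
while time.monotonic() < ride_end:
    if turnstile1_pressed():
        queue_count += 1    # people can still join the queue mid-ride
    time.sleep(0.05)        # short poll interval instead of one long delay

print("Ride finished; people waiting:", queue_count)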

Some of the programs were basic and consisted of a countdown timer started manually when a
count of six people was met. Others, like the one below, comprise simple but appropriate
constructs that have been used correctly, although this could be made more efficient with more
subroutines to reduce the length of the program. This program was essentially a timing system
that was started manually, which was demonstrated when the audio-visual recording was seen.

For many learners there were good examples of developing the program to produce working
systems. Learners achieving higher marks for this activity produced programs that were well
constructed and had the facility to deal with unexpected events. The programs were
organised, well annotated and easy to follow. This would make it easy to make changes, even
by another person.
Extract 1

Extract 1 shows part of a program from a learner. The image is supported by detailed annotation
to describe what the program is doing. In this image, it shows that the program can monitor
people entering through turnstile 1 and going through turnstile 2 to join the ride. It also counts
anyone entering turnstile 1 while the ride is in motion, which is exactly what the brief required.
Extract 2

Extract 2 shows another part of the program, with the addition of macro commands to
make the program more efficient. Again, this is well supported by useful annotation that
references each part of the program.

Note: It is not necessary to annotate every stage of the program like this learner has done.
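The efficiency gain from macros (or, equivalently, subroutines) is simply that repeated blocks of code are written once and reused. A minimal, platform-agnostic sketch of this principle is given below; the routine name and values are assumptions, and on the learner's hardware this would correspond to a macro or subroutine call rather than a Python function.

WAIT_PER_PERSON = 30    # assumed seconds of wait per person queuing

def show_status(queue_count, turnstile2_open):
    """Single reusable routine that refreshes the display and indicators."""
    state = "unlocked" if turnstile2_open else "locked"
    print(f"Queue: {queue_count}  Wait: {queue_count * WAIT_PER_PERSON}s  "
          f"Turnstile 2 {state}")

count = 0
count += 1                  # a person joins via turnstile 1
show_status(count, True)    # one call instead of repeating the display code
count -= 1                  # a person passes through turnstile 2
show_status(count, False)   # the same routine reused once the ride area fills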
Extract 3

Extract 3 shows a program produced using a text-based programming language. The learner has
supported the code by annotating the key areas. It is well explained, showing how a person
entering through the sensor affects the total: the count is incremented by 1 and shown on the
display.

Activity 5 – System testing and results analysis


In this activity, learners should update the initial test plan to show the actual results from the
tests. These should be used to check the outcome against the client brief, for example the
requirement that the waiting time increases as more people enter the queuing area.

The quality of evidence varied here, depending on the initial test plan and how well the tests
and the outcomes were carried out and documented. Some learners did not update the test plan
and lost valuable marks.

For others, simple entries were seen that showed some test results, such as download success
or LEDs working, but did not evaluate the outcomes against the client brief. For example:
It is important at this stage to cross reference/check the test outcomes with the needs of the
client and provide a summary of this within the activity. High marks can be achieved by
ensuring the testing is structured and includes some unexpected events.
The extract below shows part of a Mark Band 3 response for this activity.

The learner has carried out structured testing in context, for example showing that the
turnstile test increased the number of people in the queue by 1 for each press of the button,
and has provided an evaluation of each test against the client brief. Evaluating the system
against the client brief should be clearly evident and supported by the test results in order to
gain high marks here. Part of this learner's evaluation is shown below; it relates to the
indicators showing whether turnstile 2 is locked or unlocked and also describes the continuous
monitoring that allows people to be added to the queue at any time.
Activity 6 – System in operation

This activity requires learners to demonstrate the system in operation and make an audio-
visual recording no longer than three minutes. The recording should be supported by a
commentary to explain what is happening at the key stages, using appropriate technical
terminology throughout. It must be emphasised that even if the system does not work, or only
partly works, marks can still be achieved, as the learner is awarded marks for the commentary
and technical terminology. For example, if the learner had a non-working system and explained
this at the start of the recording (or it was evident from viewing), examiners would still award
marks for the commentary explaining how the system was intended to work and for the
technical terminology used. The quality of evidence for this activity varied considerably across
learners: some failed to show a working system, some showed a partly working system and
many showed a fully operational system.

Learners should be reminded that the recording should show the system in operation, i.e. the
hardware, showing how the inputs are operated and the resulting outputs such as LEDs, a
buzzer etc. Some learner recordings showed only the program, with a commentary on this,
including the downloading of the program to the microcontroller. However, they did not
demonstrate the system working and marks were lost here. An example of this is shown as a
screenshot image from one learner:
This learner has provided a commentary based on the program only. This was explained using
appropriate technical terms and explained how the program worked. However, the learner did
not show the system in operation. As a result, marks were lost here, limiting the learner to Mark
Band 1.

In order to achieve high marks for this activity, learners should produce a fully functioning
system that shows consideration of the user experience. There should be some handling of
unexpected events demonstrated within the recording. Learners should not overlook the role
of the program and how it works with the system, but do not need to demonstrate this on
its own. The best marks are obtained when learners refer to what the program is doing and how
the system is responding at each stage. Learners can use prepared notes to help them through
the commentary but should not rely on these as the sole resource. The best commentaries
were seen from learners that had some bullet-point notes and described the system as it went
through the various stages. Finally, avoiding generic terms is beneficial here, so learners should
be encouraged to use accurate terminology throughout the recording to demonstrate their
understanding further.

Extract 1

The audio visual file attached shows a successful outcome for Activity 6.

This demonstrates a good understanding of the system, with appropriate technical terminology,
and references to the program are made throughout the commentary. The commentary would
have been better if the learner had demonstrated a clearer understanding of the relationship
between the hardware and the program.

The learner shows how people entering through the first turnstile are monitored, with a time
related to this being displayed on a screen. The system is fully functioning, with indicators
linked to the monitoring of the turnstiles and the waiting time changing as people enter the
ride area. The system also allows people to leave the queuing area as an unexpected event and
has the emergency stop function as required in the brief. The learner has set a 45-second time
to allow people to enter/leave the ride and a 30-second ride time. The system times the ride
while people are entering or leaving it, showing that there is continuous monitoring of the
system.

The audio-visual recording was awarded Mark Band 4, meeting all the requirements for the
activity. The audio-visual recording is located in the zip file.
