Insight: Frequently Asked Questions (FAQs)

How does Insight support Curriculum for Excellence (CfE)?

Insight aims to support the key principles of CfE by helping local authorities and schools to focus on raising attainment for all groups of learners and understanding and reducing the gap between higher and lower attainers.

Insight's key benchmarking features measure attainment at the point of exit from school, reflecting a key ambition of Curriculum for Excellence, in which schools are encouraged to consider the best and most flexible progression routes for their pupils within the senior phase.

Who has been involved in developing Insight?

Insight has been developed by the Scottish Government in partnership with Education Scotland, Scottish Qualifications Authority (SQA), Association of Directors of Education Scotland (ADES), Educational Institute of Scotland (EIS) and School Leaders Scotland (SLS).

Secondary school and local authority representatives have been involved at various stages of the development process and were able to provide feedback on all of the prototypes. This has continued since the release of the live version and will be an ongoing feature of its development.

How does Insight account for those learners who sit exams early?

Insight includes only senior phase qualifications, and this may affect trends for schools which previously presented pupils up to the end of S3. Because Insight uses 'best' data, pupils who go on to gain higher-level awards in the senior phase will in effect 'cancel out' the attainment gained in the Broad General Education. However, a pupil who gains awards in S3 and takes those courses no further will not have this attainment recorded within Insight; it can be captured in other ways, e.g. through pupils' S3 profiles.

What about other sources of achievement?

It is important to continue to value all learning and outcomes gained. Pupil profiles are also important sources for recording and recognising achievement and information will also be collected by schools via their standards and quality reports. It is important that the needs of the learner are the primary consideration when selecting courses or programmes to enter and that a school does not make decisions on its curriculum based on the data shown in Insight alone. This could mean in some cases providing courses and programmes which are not included in Insight.

How are different CfE curriculum approaches taken into account?

The Insight tariff measures are designed to take account of different curriculum models under CfE by measuring 'best' attainment at point of leaving school. For example, where a young person gains National 5 in English and goes on to gain a Higher in English, only the 'best' award, i.e. the Higher, will be counted in the national benchmarking measures.
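A minimal sketch of this 'best award' de-duplication, assuming a simple list of (subject, SCQF level) pairs; the subjects, levels and data layout are illustrative and not the tool's actual implementation:

```python
# Sketch of the 'best award' rule: where a leaver holds several awards
# in the same subject, only the highest SCQF level is counted.

def best_awards(awards):
    """Keep only the highest-level award per subject.

    `awards` is a list of (subject, scqf_level) tuples, e.g. a National 5
    (SCQF 5) superseded by a Higher (SCQF 6) in the same subject.
    """
    best = {}
    for subject, level in awards:
        if level > best.get(subject, 0):
            best[subject] = level
    return best

pupil = [("English", 5), ("English", 6), ("Maths", 5)]
print(best_awards(pupil))  # the Higher (level 6) supersedes the National 5
```

In this example only the English Higher and the Maths National 5 would feed into the national benchmarking measures.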

The tariff scale aims to ensure that appropriate credit is given for pupils' awards and qualifications, taking into account the level (under the SCQF) and the varied means by which pupils study (for example Units, whether they stand alone or become parts of courses). For more information on the Insight Tariff see: Insight Tariff Note and Deep Dive 3: Guide to Insight Tariff

The breadth and depth measure counts awards based on the 'best' achievements of leavers, and a school's curriculum model will have an impact on this measure.

Which courses feed into which curricular area?

The Other Local Measures section of the technical guide provides more information on this.

There have been changes in the system through Curriculum for Excellence. Can I really make comparisons over time?

Where possible the tool has been designed to allow comparisons over time, for example by aligning old and new qualifications in the tariff scale. However, it is impossible to ensure that all trends are fully comparable, and it is important to apply your own professional judgement and local knowledge alongside the data presented. In particular:

How do I access Insight?

Local authorities and secondary schools can access Insight using the link below:

Schools and local authorities can access Insight using their SEEMiS or Glow credentials in order to save the majority of users having to remember a separate login.

What happens if my school does not use SEEMiS?

If your school is not on SEEMiS you can still use a Glow login. Otherwise, you will need to be set up with a ScotXed Insight account to access the tool. If this is the case, please alert your headteacher, who will contact the relevant person in your local authority to request this.

I can't remember my SEEMiS credentials: what should I do?

Please contact SEEMiS or your local SEEMiS support team if you have any problems logging in with your credentials.

I can't remember my Glow credentials: what should I do?

Please contact the Glow team or your local Glow support team if you have any problems logging in with your credentials.

How do I configure my SEEMiS account for Insight access?

In order to access Insight using SEEMiS, your SEEMiS account must be properly configured. You also need to use the username (login) which you use for SEEMiS web applications. In some cases this may be different from the one you use for Click & Go. For advice on these issues please contact your local SEEMiS support team.

The following advice may assist SEEMiS support teams in configuring accounts:

What data can I access for other local authorities and schools?

From September 2017, an open access model has been in place. The intention behind this approach is to extend the previous access arrangements to allow all users to view data for all establishments without suppression or controls. However, it should be noted that Insight policy is to suppress data for values of less than five. Basic disclosure controls will therefore continue to apply when a school or local authority level user is looking at data for another establishment.

Because of these suppression arrangements, in almost all instances graded course data will be suppressed for at least one of the grade bands. In that case, further suppression is required to prevent calculation of the suppressed value. For this reason, the previous access model (access only to your own establishment's data, or that of any learning partners) is retained for some of the course-based measures, including:

On this basis schools will continue to see only their own data for these measures and local authority level users will be able to view all data for schools within their own local authority. It is worth noting that where a learning partnership has been entered into between schools, a user will continue to see data for the partner establishment without any suppression or controls.
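The suppression arrangements described above can be sketched as follows; the secondary-suppression rule shown (hide the next-smallest band when exactly one band is already hidden and a total is published) is an illustrative simplification, not the tool's actual implementation:

```python
def suppress(counts, total_published=True, threshold=5):
    """Primary suppression hides any value below `threshold`; where a
    total is published and exactly one cell is hidden, the next-smallest
    cell is also hidden so the first cannot be recovered by subtraction."""
    masked = {k: (None if v < threshold else v) for k, v in counts.items()}
    if total_published and sum(v is None for v in masked.values()) == 1:
        visible = {k: v for k, v in masked.items() if v is not None}
        if visible:
            masked[min(visible, key=visible.get)] = None
    return masked

bands = {"A": 40, "B": 25, "C": 12, "D": 3}
print(suppress(bands))  # D (<5) is hidden, then C as secondary suppression
```

Without the secondary step, a reader who knew the total number of entries could recover the hidden band by subtracting the visible ones.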

Do independent schools and special schools have access to Insight?

Special schools and independent schools are not currently included in Insight. However, we will continue to investigate the possibilities around their inclusion as part of the ongoing development programme, engaging with relevant stakeholders to understand the different characteristics and requirements of these schools that would need to be taken into account for any extension of the tool's scope to be of value. Furthermore, Insight draws on data that is not collected for special schools and independent schools, so the differing technical issues and possibilities would also need to be investigated.

Are colleges included?

Attainment information for pupils on the roll of a school and participating in school-college partnerships is included in Insight in the base school's local and national dashboard measures (literacy and numeracy, attainment for all and attainment v deprivation) and in breadth and depth measures.

Can Parent Councils and parents be given access to Insight?

Insight is a tool for professional reflection and self-evaluation, and is therefore not accessible to parents, pupils and the wider public. Information aimed at parents will be found on the Scottish Government's School Information Dashboard, and will be updated annually.

How is Insight different to STACs?

Insight replaced STACs (Standard Tables and Charts) in September 2014 and has a number of enhancements including:

Can I access STACs data for 2014 onwards?

No. STACs was last published in 2013; it has not been calculated since then and is no longer available online.

STACs was based on the course entries expected to be completed within an academic year; however, Insight is based on the result date of learners' qualifications to allow for a wider range of awards to be included. In some cases, this may lead to attainment in Insight being reported in a different academic year than had been the case in STACs or by SQA. There will also be differences between the total number of 'resulted entries' described by Insight and the total number of 'entries' described previously in STACs. However, figures on actual attainment (i.e. an award at A to D in National 5 or an award at National 4) should be similar in all sources.

In both the tariff scale and the breadth and depth measure, Standard Grades are treated as equivalent to the following national qualification grades:

Standard Grade    National Qualification grade
1                 National 5 A
2                 National 5 C
3                 National 4
4                 National 4
5                 National 3 **
6                 National 3 **
7                 N/A (Standard Grade 7 is not SCQF levelled)

** Note that Standard Grade 5 and 6 receive 18 SCQF credit points whilst a National 3 receives 24 credit points and this therefore translates into differences in tariff points for these qualifications.
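For reference, the equivalence table above can be expressed as a simple lookup; the mapping is taken directly from the table, while the data structure itself is purely illustrative:

```python
# Standard Grade -> National Qualification equivalence, per the table above.
# Note the credit-point caveat: Standard Grades 5 and 6 carry 18 SCQF
# credit points, whereas a National 3 itself carries 24, so tariff points
# for these qualifications differ despite the shared level.
SG_TO_NATIONAL = {
    1: "National 5 A",
    2: "National 5 C",
    3: "National 4",
    4: "National 4",
    5: "National 3",  # 18 SCQF credit points (vs 24 for a National 3)
    6: "National 3",  # 18 SCQF credit points
    7: None,          # Standard Grade 7 is not SCQF levelled
}

print(SG_TO_NATIONAL[2])  # National 5 C
```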

Will Insight include a 'pass rate' for graded awards such as Higher?

Insight will not provide a 'pass rate' for Higher or other awards. However, the tool recognises achievement of this award through the course-based measures and through the tariff. The breadth and depth measure also provides information on the number of awards achieved by SCQF level.

How is Grade 'D' treated in Insight?

The 'graded course' output shows Grades A to D as well as no awards. Candidates gaining a D will be shown as such on this measure.

National 5 courses are graded A to D or 'No Award'. Grade D is not a pass, but is reported on the Scottish Qualifications Certificate as a course award in its own right. Grade D indicates the candidate has achieved a band 7 in the external assessment.

Where is partnership attainment captured within Insight?

Awards sat at partnership centres will be included in the national and local dashboards and the breadth and depth measures when you are viewing Insight as the base centre. In the other local measures (graded and ungraded course measures and curricular area measures), this attainment will show for the presenting school. This process ensures that, at a school level, the school gets credit for doing what is best for the learner but at a course level, the data reflects the performance of that course in that school.

It is possible to view partnership attainment at a course level in Insight, using the Partnership Course Summary. This gives more visibility to courses attained via non-school presenting centres and provides the base centre with visibility of their learners' achievements at other establishments.

How can I ensure that my pupils' attainment which is undertaken through a partnership agreement in other centres is captured in Insight?

The process of linking candidate attainment in other centres back to their base schools relies on the following criteria being met:

Some of my pupils sat, but did not receive, National 5. We have since entered them for Recognising Positive Achievement at National 4. Where will this show in the tool?

Candidates who were entered for and achieved the National 4 Added Value Unit (and Literacy / Numeracy Unit where applicable) prior to August certification in 2019 will have been awarded the National 4 course on their August certificate, provided that they received no award in the National 5 course assessment, attempted all or part of the course assessment, and passed all of the internally-assessed units of the National 5 course. These National 4 results are counted with the 2019 attainment data and would have appeared in the tool from September 2019.
Where such candidates had not been entered prior to August certification in 2019, an automatic course entry at National 4 is created by SQA. This will have appeared as an incomplete entry in Insight in September 2019. Where that entry was converted into a National 4 award (through achieving the added value unit and literacy / numeracy unit where appropriate) by the start of December 2019, that result would have appeared in the tool in February 2020 and would be recorded as 2019 attainment.
Where such candidates achieve the added value unit and literacy / numeracy unit (if appropriate) after early December 2019 and by April 2020, that result will appear in the tool in September 2020 and will be recorded as 2020 attainment.
The same is true for subsequent years' attainment.
This is in line with SQA's own statistical reporting.
Further information on Recognising Positive Achievement can be found on the SQA website at:

How does Insight handle incomplete awards?

Incomplete awards with a result date in the relevant academic year are included in the tool. In the course measures, they contribute to the 'no award' and 'number of resulted entries' figures. Higher volumes of incomplete awards are likely to be seen in the September edition of the tool, which reflects pre-review data. By the February edition, the volume of incompletes is reduced in the post-review data, as late-submitted results and some National 4 automatic entries for Recognising Positive Achievement have been processed.

How can I look at five years of data for all measures?

You can choose to see five years of data by changing your preferences from the user menu. However, when the summer update is released with the latest attainment data, corresponding leavers' data is not yet available. Between the summer update and following spring update, leavers' measures will only show four years of data.

When I apply filters in the Whole School Course Summary, I don't see the data I expect to. Why is this?

The Whole School Course Summary (WSCS) draws on and presents an overview of detailed data provided by a number of other measures within Insight. If the requested data has not already been calculated, no data is displayed.

Insight calculates much of its data in advance so that it can be provided quickly and is available to measures such as the WSCS. It does this for 'core cohorts', which include whole school, stage and sex cohorts. However, it is not practical to pre-calculate data for all possible filter combinations. Insight overcomes this by calculating data on demand for 'non-core' cohorts (i.e. those cohorts filtered on factors other than stage and sex). When it does this, it stores the results in case they are needed in future. The WSCS measure displays both pre-calculated data and any available calculated-on-demand data.

When selecting a filter, no data is displayed. Why is this?

There is a minimum cohort size defined for calculation of data for all measures. This is set at a level that tries to ensure meaningful information is produced. For example, this might be set to 10 for a measure and would be why no data is calculated when you select a cohort with only 9 individuals in it.
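A sketch of this rule, assuming the threshold of 10 quoted as the example above; the actual threshold may vary by measure:

```python
MIN_COHORT = 10  # illustrative example threshold from the text

def can_calculate(cohort):
    """Data is only calculated when the selected cohort meets the
    minimum size, so very small filtered groups produce no output."""
    return len(cohort) >= MIN_COHORT

print(can_calculate(range(9)))   # False: a 9-pupil cohort is too small
print(can_calculate(range(10)))  # True
```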

A course is not appearing in the Local Course Measure: Course Comparator ▶ Selected Grade Course. Why is this?

In order for a course to have a Course Comparator value calculated for a particular group of pupils we require that at the national level more than 60 pupils take the course and have other course attainment within that academic year, and that the national regression provides a reasonable fit to the data. If the national relationship does not provide a good prediction of attainment then we are unable to provide a reliable measure of whether pupils in a particular school are doing better or worse than predicted. This means that a Course Comparator value is only ever provided for a subset of the graded courses where we can be confident in the predictions of our underlying model and therefore in reporting whether your pupils are doing better or worse than could be expected. Courses excluded from the Course Comparator for the above reasons will still show in the 'whole school course summary' as in this case it is simply presenting the attainment from your pupils.
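The eligibility conditions described above might be sketched as follows; the fit statistic and its cut-off value are assumptions for illustration, as the source says only that the national regression must provide 'a reasonable fit':

```python
def comparator_available(national_pupil_count, model_fit,
                         min_pupils=60, min_fit=0.5):
    """A Course Comparator value is produced only when more than
    `min_pupils` take the course nationally (with other course attainment
    in the same year) AND the national regression fits the data well.
    `model_fit` and `min_fit` are hypothetical stand-ins for whatever
    goodness-of-fit statistic the methodology actually uses."""
    return national_pupil_count > min_pupils and model_fit >= min_fit

print(comparator_available(100, 0.8))  # True: enough pupils, good fit
print(comparator_available(50, 0.9))   # False: too few pupils nationally
```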

How is wider achievement included in Insight?

Wider achievement is captured in Insight, and a list of the current awards included can be found within Deep Dive 2: Recognising Achievement. It is our intention to increase the number of wider awards in the tool. All awards must meet our criteria: they must be on the SCQF, fit with Curriculum for Excellence principles, and be able to meet our technical requirements.

How is the Insight Leavers' Cohort defined?

A school leaver is classed as a young person in the senior phase (S4-S6) who left school during the school year. The school year is taken to run from the school census day one particular year to the day before census the following year. For example, for 2017/18 leavers, the leaver year is 20th September 2017 to 18th September 2018. A school leaver will be on the roll of a school in one year but not the next, as identified in the school census.
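Using the 2017/18 dates given above, the leaver-year window can be sketched as:

```python
from datetime import date

# Census dates for the 2017/18 leaver year, as given in the text:
# the leaver year runs from census day 2017 to the day before census 2018.
CENSUS_2017 = date(2017, 9, 20)
CENSUS_2018 = date(2018, 9, 19)

def is_2017_18_leaver(leaving_date):
    """True if the pupil left on or after census day 2017 and before
    census day 2018, i.e. 20 Sep 2017 to 18 Sep 2018 inclusive."""
    return CENSUS_2017 <= leaving_date < CENSUS_2018

print(is_2017_18_leaver(date(2018, 6, 1)))   # True
print(is_2017_18_leaver(date(2018, 9, 19)))  # False: 2018/19 leaver year
```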

Identified school leavers are contacted by SDS staff during the month of September in order to establish the nature of the destination being undertaken, if any. While SDS do record additional information provided by the learner about the destination (e.g. course level and title, job title etc.), only the category of the destination is provided to Insight.

The initial destinations data received from SDS is compared to the pupil census information to ensure that there is a leaver record in both datasets. Pupils who have left school (according to the census) but do not have an SDS destination record (if they have moved abroad, for example) are excluded from Insight national measures.

How are winter leavers considered in Insight?

Young people in S5 who have a leaving date on or before 31st December are classed as Winter Leavers. As they were on the school roll at the time of the September census, Insight will not recognise them as leavers in the current session. They will not be included in the subsequent February Insight update, but will be included in the following year (i.e. a pupil who left school in December 2018 will not be included in the February 2019 update, but will be in the February 2020 update).
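A sketch of the winter-leaver rule above; the census date shown is an assumption for illustration:

```python
from datetime import date

def is_winter_leaver(stage, leaving_date, census_day):
    """An S5 pupil who was on the roll at the September census and leaves
    on or before 31 December is a winter leaver, so Insight reports them
    a session later (e.g. a December 2018 leaver first appears in the
    February 2020 update, not February 2019)."""
    end_of_year = date(leaving_date.year, 12, 31)
    return stage == "S5" and census_day <= leaving_date <= end_of_year

print(is_winter_leaver("S5", date(2018, 12, 10), date(2018, 9, 19)))  # True
print(is_winter_leaver("S6", date(2018, 12, 10), date(2018, 9, 19)))  # False
```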

To ensure that S5 winter leavers are matched to other pupils who have had similar opportunities to achieve National Qualifications, the virtual comparator methodology:

Why is this cohort slightly different to that published by SDS?

Within Insight, the concept of a leaver is not just important for the reporting of destinations. It is also the basis on which other key attainment statistics are presented, such as all four of the benchmarking measures on the National Dashboard, including school leavers' literacy and numeracy, attainment and also attainment in the context of deprivation.

To make this possible, the Pupil Census is the primary dataset on which the rest of the tool is based. It is necessary for identifying the characteristics of pupils (for filters) and for linking to attainment. The SDS data must therefore be successfully matched to the Pupil Census. Without such a match it is not possible to identify the characteristics and attainment of the pupil, and so they are not included in the Insight leavers' cohort.

In addition, Insight is designed to focus on the senior phase (S4-S6) of publicly funded secondary schools. SDS published data takes account of leavers from all stages (i.e. including S3 and adult leavers) at publicly funded schools.

The flow chart below illustrates how the Insight leavers' cohort is defined (using 2013-14 and 2014-15 as an example) and how it differs from the SDS data. Green borders indicate leavers who are included in the Insight definition whilst red indicate leavers who are not.

Following changes to Skills Development Scotland's (SDS) processes for recording destinations of 16-19 year olds in Scotland, no cohorts were back updated for the February 2017 update of the tool, nor will they be in future. From September 2016, the SDS school leaver cohort is based on leavers who left school between the third Wednesday of September (Pupil Census day) and the third Tuesday of September the following year. Further information can be found in the leavers FAQ section.

What is meant by the tariff scale and tariff scores?

One of the ways in which Insight analyses performance is through the use of a tariff scale. This tariff scale has been developed specifically for Insight.

Tariff scores provide a way of comparing attainment between schools, local authorities and the virtual comparator and the ability to look at a breakdown of cohorts within the senior phase. In addition, tariff scores allow us to look at different types of achievement and awards from a range of providers.

How should the tariff scale be used?

The tariff scale allows Insight to provide overall summary measures of attainment for schools and local authorities, although it is only one part of the national dashboard. The tariff measures on the national and local dashboards present information on average total tariff by default. However, where an establishment wants to interrogate this data further, it can access an average complementary tariff measure by means of a filter. It is helpful to consider performance on tariff measures alongside the others (leaver destinations and attainment of literacy and numeracy) to get a clear and rounded picture. The tariff measures allow you to compare trends over time for your own school and also to compare against a virtual comparator. Comments are provided if there is a significant difference between a school and its VC but, as with other measures, you will also wish to use the filters and your own local knowledge to interpret the results fully. We have made available a detailed note on the tariff methodology.

How is the tariff review progressing?

The review of the Insight Tariff has now been completed. On the advice of the Tariff Review Group, the Insight Project Board approved the strategic principles of the tariff, the introduction in the September 2018 update - on a pilot basis - of the Grade Neutral Tariff (GNT) and further consideration of how wider awards could be better recognised within the education system. They also accepted the vital importance of improved communication on Insight and how it can best be used and improved links with other aspects of education and learning, in the light of the Learner Journey report and other developments.

What is the complementary tariff measure?

During the 2015 tariff health check, feedback suggested that some users perceived the total tariff to favour more standard pre-Curriculum for Excellence approaches while some (fewer) perceived that it might encourage a focus on the most able pupils. However, it was also recognised that there is a risk of intervening prematurely if the tariff scale itself were modified at this point, whilst CfE and new qualifications are still becoming embedded.

We also heard from users about a desire for more information to support schools and local authorities to better understand the impact of different curriculum design. Based on work carried out by the Statistics Working Group, and presented to and approved by the Project Board, we implemented a complementary tariff measure as a pilot in the February 2016 update to the tool.

The 2016 tariff health check found that, in the vast majority of cases, the messages from the total and complementary tariff were the same or similar. This is in line with expectations. Where differences did occur, they were most common for the cohort with the highest 20% of attainment. These differences could be explained by differences in the number of qualifications students took. Therefore, the complementary tariff is working as expected and provides schools and local authorities with additional evidence of performance to inform their evaluations and reflections. Used in combination with the existing tariff, the complementary measure will support equity of measurement regardless of establishment or curricular approach.

Deep Dive 3: Guide to Tariff provides more information on how to use the complementary tariff:

Does the tariff score relate to UCAS points at all?

There is no relationship between the tariff points used in the tool, which are used for benchmarking purposes, and the points awarded by UCAS to inform university entrance.

Why is the virtual comparator the key benchmark?

Insight allows schools to compare their performance to the performance of a virtual comparator, which is made up of pupils from schools in other local authorities who have similar characteristics to the pupils in your school. This is helpful because it allows a comparison based on pupils who are like yours on key variables (sex, deprivation, additional support needs and stage of leaving / latest stage) that are linked to educational outcomes rather than comparison with real schools which may have quite a different pupil profile. The virtual comparator therefore controls, to a large extent, for the background characteristics of pupils in your school and offers a fairer comparison.

When I look at the information for my school compared to its virtual comparator (VC) what should I be interested in?

When comparing your school to its virtual comparator you will be interested in whether the value for the measure for your school is above or below its VC. As with any model, the measure value for the virtual comparator will have a margin of error, so it is also important to consider whether a commentary is provided for the comparison. If there is, then the difference is both statistically significant and large enough to be of educational interest. You might then look at whether the trend over time indicates that you are consistently performing above or below the virtual comparator, or whether this has changed.

The virtual comparator is provided as a benchmark to allow schools to compare performance against pupils with a similar background to their own. However, the absolute value for the virtual comparator on a measure is perhaps less of interest than:

We would certainly encourage schools to explore all of these areas, together with analysing their own performance over the three or five year period: has the school's performance improved? Is there a trend?

From a validity perspective, you will also want to consider the number of pupils involved in the comparison. For example, if you have applied a range of filters you might find that actually you are looking at a very small number of pupils and therefore the comparison may be less useful, given that small numbers are more prone to fluctuation.

It is important to stress that the virtual comparator is a way of asking a question: 'how does the performance of young people in my school compare to those of young people with a similar background to ours from schools all around Scotland?'. It does not suggest that this performance is excellent, very good or even poor.

Is the virtual comparator always a good benchmark? Are there circumstances where I should have caution in interpreting it?

The virtual comparator is designed to offer a comparison for your school taking account of key background characteristics (sex, deprivation, additional support needs and stage of leaving / stage) of your pupils. It is therefore extremely valuable for interpreting your school's performance and improvement on the dashboard measures and other measures which consider achievements for the whole leavers' cohort or whole senior phase. The leavers' cohort is the most important, since it is the most consistent point of comparison given the various routes and pathways available to schools for their young people as they progress through the senior phase.

However, there are some particular circumstances where you should exercise caution in interpreting the data and take account of your own local knowledge and context. These are:

Why can't the virtual comparator address these circumstances?

The virtual comparator is constructed based on four variables which were highly statistically significant. The number of variables included in the methodology was found to strike a reasonable balance between matching a pupil appropriately and not having so many variables that matching some individuals was hard to achieve. It also balances the need to account for factors which influence educational outcomes whilst telling us how well we are serving particular groups. For these reasons the virtual comparator could never include all of the variables which are available as filters in the tool.

A health check was conducted on the virtual comparator in 2016. The results of this health check were reassuring and showed that the messages provided to schools remained consistent when the virtual comparators were reselected multiple times (on the same characteristics). As CfE matures, curriculum models and presentation patterns continue to develop. Through the review process, data from the first CfE cohort will help to inform the development of the VC and ensure that it continues to provide a meaningful benchmark for schools.

Will the virtual comparator for my school ever change?

Yes. It is possible for changes in the virtual comparator to occur. This happens if there are changes in the underlying data which feeds into Insight and which is used to build the virtual comparator. The VC has been designed so that in most cases changes are small and the messages presented should not be affected.

The construction of the virtual comparator is based on selecting ten pupils at random to match each of the pupils in your school. If all of the pupils included in the tool remain the same then the pupils included in the virtual comparator will not change and the measure values for the virtual comparator will not change.

However, if there is a change to the number of pupils or to the characteristics of pupils included in the tool then this will lead to a change in the pupils included in the virtual comparator and to differences in virtual comparator measure values. Further information on using virtual comparators can be found in the technical guide.
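The construction described above (ten randomly selected matched pupils per school pupil, stable whenever the underlying data is unchanged) can be sketched as follows; the pupil record fields, the matching key and the fixed seed are illustrative assumptions, not the published methodology:

```python
import random

def build_virtual_comparator(school_pupils, national_pool, n_matches=10):
    """For each pupil in the school, select ten pupils at random from a
    national pool of pupils in other local authorities who share the same
    key characteristics (sex, deprivation, additional support needs,
    stage). A fixed seed means identical input data always yields the
    same comparator, mirroring the stability described above."""
    rng = random.Random(0)  # deterministic: same data -> same comparator
    comparator = []
    for pupil in school_pupils:
        key = (pupil["sex"], pupil["simd"], pupil["asn"], pupil["stage"])
        candidates = [p for p in national_pool
                      if (p["sex"], p["simd"], p["asn"], p["stage"]) == key]
        comparator.extend(rng.sample(candidates, n_matches))
    return comparator
```

Because the seed and candidate pool fully determine the selection, any change to the pupils in the tool (or to their characteristics) changes the sampled comparator, which is why VC measure values can shift between updates.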

Where can I find out more about the virtual comparator methodology?

Further details on the virtual comparator methodology can be found in the Technical Guide.

Have there been any changes to the underlying data feeding into virtual comparators in the latest update?

The February 2022 release of Insight reflects the most recent data about school leavers for 2020/2021. In previous Insight updates, past cohorts have been back updated if required to take account of the latest data from Skills Development Scotland (SDS).

Following recent changes to SDS's processes for recording destinations of 16-19 year olds in Scotland, no cohorts have been back updated for the February 2017 update of the tool. From September 2016, the SDS school leaver cohort will be based on leavers who left school between the third Wednesday of September (Pupil Census day) and the third Tuesday of September the following year. Further information can be found in the leavers FAQ section. Going forward, we expect updates and corrections to happen less frequently and consequently for there to be fewer occasions on which comparator values may change.

What has the project learned from the health check of the virtual comparator to date?

The implementation of the virtual comparator is designed such that, if there are no changes to the pupils or the underlying data, then the virtual comparators will be the same in every run of the tool. As part of our health checks of key methodologies in Insight, the team is continuing to undertake analysis to better understand the sensitivity of the virtual comparator and commentaries methodologies to change and to provide further guidance for users on this issue.

Sensitivity testing on the virtual comparator in 2016 provided reassurance that changes to the underlying data do not significantly affect the messages from a school-to-VC comparison. Changes to the underlying data may alter the absolute value of the difference between the two but do not tend to change its statistical significance. Therefore, as described in question 2, the absolute value of the virtual comparator on a measure is generally of less interest than the direction of the difference, the strength of the comparison and the trend over time.

The commentaries offered in the tool provide important information on the key messages from any comparison. During phase 2 of the project, we will be continuing to develop the availability of commentaries information, including exploring ways of making more information available for users on the statistical significance of comparisons.

Are commentaries provided for all measures?

No. Commentaries are provided for the key measures in the tool. However, commentaries are not currently provided for:

In addition there are some differences in the way commentaries are provided between measures, including:

Have there been any updates to the commentaries methodology?

Yes. We've made improvements which make the methodology more robust and should lead to more comments being displayed in Insight. These improvements may also mean a very small number of commentaries will change.

From February 2016, we have implemented a population-weighted standard deviation approach to determining whether differences are large enough to be of interest educationally. In addition, we have calculated this separately for comparisons where the cohort is performing above its VC and for comparisons where the cohort is performing below its VC. This helps to ensure that commentaries are not unduly driven by very small cohorts and to deal with situations where the distribution of these differences is not completely symmetrical.
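The two-sided weighting idea above can be sketched in a few lines. This is a minimal illustration only, written under stated assumptions: the function names, the use of cohort size as the weight, and the shape of the inputs are invented for demonstration and are not Insight's published implementation.

```python
# Illustrative sketch only: function names, the use of cohort sizes as weights
# and the input format are assumptions for demonstration, not Insight's
# published implementation.

def weighted_sd(diffs, weights):
    """Population-weighted standard deviation of school-minus-VC differences."""
    total = sum(weights)
    mean = sum(d * w for d, w in zip(diffs, weights)) / total
    var = sum(w * (d - mean) ** 2 for d, w in zip(diffs, weights)) / total
    return var ** 0.5

def split_thresholds(diffs, weights):
    """Estimate spread separately for cohorts above and below their VC, so a
    single cut-off is not skewed by very small cohorts or by an asymmetric
    distribution of differences."""
    above = [(d, w) for d, w in zip(diffs, weights) if d > 0]
    below = [(d, w) for d, w in zip(diffs, weights) if d < 0]
    sd_above = weighted_sd([d for d, _ in above], [w for _, w in above]) if above else 0.0
    sd_below = weighted_sd([d for d, _ in below], [w for _, w in below]) if below else 0.0
    return sd_above, sd_below
```

Splitting the spread estimate this way means a school performing above its VC is judged against the variation among other above-VC cohorts, and likewise for below-VC cohorts, rather than against one symmetric threshold.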

Will the commentaries continue to be improved?

The commentaries currently provided are working well. They highlight comparisons that are statistically significant (i.e. where the difference between the school and its VC is unlikely to arise purely by chance) and which are large enough to be of interest educationally. We will, however, continue to health check and, where appropriate, strengthen elements of the commentaries methodology.

How should I interpret the commentaries information?

Commentaries are provided to give you an indication of key comparisons which you may wish to explore in more detail. When a commentary is displayed for a comparison between a school and its virtual comparator, this tells you that the difference is statistically significant and that it is large enough to be considered of interest educationally. However, your own professional judgement, local knowledge and understanding of the context of your school should also be taken into account.

It is also important to be aware of the current limitations of the commentaries methodology and note in particular that:

Are the commentaries for my school fixed?

Commentaries are provided for comparisons between a school and its virtual comparator (and the national establishment for the deprivation measures). If there is a change to the number of pupils or the characteristics of pupils included in the tool then this will lead to a change in the pupils included in the virtual comparator and to differences in virtual comparator measure values. This can, in turn, lead to changes in the commentaries provided. The VC has been designed so that in most cases changes are small and the messages presented through commentaries are not greatly affected. Our current testing supports this and we will continue to monitor it through the ongoing health checks of the virtual comparator and commentaries methodologies.

How should I interpret the learning partners' ranking information?

The Learning Partner feature has been provided in order to allow schools to work more closely with other schools by sharing more detailed data. This is intended to support professional dialogue.

The way a school chooses its learning partners might depend on its focus. To help schools find others they may wish to partner with, Insight provides a ranking of 'closeness' which is based on how schools compare to their virtual comparator on the 'attainment for all' measure. Since this is based on virtual comparators, there may be changes between versions of the tool if there are changes in the underlying virtual comparator data. Schools ranked closer to 1 have a relationship to their virtual comparator that is most similar to your school's relationship to its own virtual comparator. However, this does not mean that schools ranked close to 1 are similar to your school in other aspects (e.g. deprivation profile).

In addition to the overall ranking, information is provided on how the centre compares to its VC in terms of its lowest 20%, middle 60% and highest 20% of attainers. This is provided as weighted normalised scores (more information on these can be found in the technical guide). For example, a negative weighted lowest 20% normalised score would indicate that the school is below its VC for this cohort. In this case, the school might want to find, within the list of learning partners, a school which is doing much better than its VC for this group, i.e. a school with a positive score. Alternatively, it may wish to partner with another school which is performing similarly, to see how they can work together to improve. A school performing quite well compared to its VC (a positive score) that wishes to partner with a school doing even better could look for a school with a higher positive score.
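As a rough illustration of reading these scores, the sketch below filters a set of hypothetical schools by the sign of their weighted lowest 20% normalised score. The school names, the "lowest20" field and all score values are invented for the example; they are not Insight's actual data fields.

```python
# Hypothetical sketch: school names, the "lowest20" field and the score values
# are invented for illustration only.

def stronger_partners(candidates, focus):
    """Return candidate schools with a positive weighted normalised score on
    `focus` (i.e. performing above their VC for that group), strongest first."""
    return sorted((c for c in candidates if c[focus] > 0),
                  key=lambda c: c[focus], reverse=True)

schools = [
    {"name": "School A", "lowest20": -0.4},  # below its VC for the lowest 20%
    {"name": "School B", "lowest20": 0.7},   # well above its VC
    {"name": "School C", "lowest20": 0.2},   # slightly above its VC
]

# A school with a negative lowest 20% score might look here for partners with
# positive scores: School B first, then School C.
partners = stronger_partners(schools, "lowest20")
```

A school looking for a similarly performing partner would instead keep candidates whose score is close to its own, but the sign-based filter above captures the simplest reading of the scores.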

It is important to remember that there will be other reasons that schools may choose to partner with each other and that the ranking is not intended as a definition of similarity. Further information on learning partners, including more detail on the methodology, can be found in the technical guide under 'Other Local Measures'.

Should I be able to create and dissolve partnerships for my school under the Learning Partners feature?

Only one individual per school will be assigned 'Learning Partner Administrator' rights to enable them to create and dissolve partnerships via the Learning Partners feature. Schools need to contact us with the name and school email address of the individual who will act as their Insight administrator.

The tool provides information on senior phase pupils in publicly funded secondary schools in Scotland based on the Pupil Census. Which pupil statuses are included?

The Insight definition of base centre is aligned with Pupil Census definitions so that it will only include pupils with a status of:

This brings the methodology into line with other analysis such as national statistics leavers publications.

Insight therefore identifies a pupil's base centre as follows:

What happens if there is a school merger? How is this information presented in Insight?

Every year, we undertake an exercise to learn about the current status of schools in Scotland in advance of the Pupil and Teacher Census collections. At this time, local authorities make us aware of any new or closing schools and about any mergers which are taking place. This information is then fed into Insight.

When two (or more) schools close and merge into a new school, information is only displayed for the open new school in Insight. As was the case in STACs, all of the historic data for the closed centres is reported against the new centre to allow a time trend to continue to be shown.

This may also affect virtual comparator values.

Why are there differences in the number of looked after children in Insight compared to other sources (e.g. SEEMiS or Children's Social Work statistics)?

Insight takes the field 'StudentLookedAfter' from the Pupil Census. This is a snapshot of pupils recorded as being looked after on census day in September of that academic year. However, a pupil's looked after status could change over the year: they might have been looked after on census day but may no longer be by the time they leave school. Insight does not illustrate these fluctuations across the academic year. For this reason, you may see differences in numbers of children who are looked after when comparing Insight to data held within SEEMiS in schools.

You may also see differences when comparing with looked after children data from Children's Social Work statistics, since the two originate from different management information systems. If you are interested in the leaver destinations of looked after children, the most robust source remains the Scottish Government's publication on the Education outcomes for Scotland's looked after children.

How do I put charts from Insight into other documents?

The simplest way to put an Insight chart into another document is to copy and paste it. In your browser, right click on the chart and select 'Copy' from the menu that appears. Then, go to your other document and paste it as you would normally. The chart will appear in the document as an image.

There are some caveats you should be aware of. Some software, such as Microsoft Office 2010 and later, tries to load the chart directly from Insight rather than simply pasting it as an image. This can mean that no chart appears or, if you are copying a chart for an establishment other than your own, that the chart for your own establishment appears instead. You can work around this by using one of the following options:

How do I include filter selections in images I copy from Insight?

Copying and pasting from Insight doesn't bring across filter selections as part of the image. You can include them by using the Print option available in Insight.

Select the print menu option. Instead of printing to a physical printer, choose a virtual printer that produces a file, such as the Microsoft XPS Document Writer or Adobe PDF printer (for help with these, consult your local IT support). Once printed, the file can be inserted into other documents or sent by email. The output will include the chart and details of the filter selections on the page.