Don’t Blame the Science, It’s not that Simple!

Paul Hayward published an interesting article on the application of science in sport. Paul referenced a report from Bath University's Department of Health, published in the journal Sport, Education & Society, which examines professional rugby clubs and warns that "the extent to which players become risk averse in order to follow strict coaching instructions could threaten to fundamentally undermine aspects of the game". The Telegraph article suggests sport science may be the cause of poor performances and says that "across sport a change is bound to come: a counter-movement, away from the "data-driven" methods Bath University have identified and back towards freedom of thought, spontaneity and a more human outlook, with all its flaws."

Personally I think the article is a great piece of journalism, in the sense that Paul has taken a snapshot of the world of sport science and created curiosity around a topic that invites discussion and debate. But is it really as simple as blaming the science for poor performance in sport, or is there more to it?

Oxford Dictionaries defines science as "The intellectual and practical activity encompassing the systematic study of the structure and behaviour of the physical and natural world through observation and experiment." If you close your eyes, what image comes into your head? Do you see white lab coats, clipboards, equations? Well, you couldn't be further from the reality of sport science. For all the lab-based testing and experiments, there is far more applied work being undertaken to understand the training and competition environments.

Sport science, as with any science, is evidence based, and therefore requires practitioners to observe, collect evidence, then test and experiment to support or refute theories. The objective is to identify a change that can improve or influence an outcome. And the one thing that underpins good science is that it should be subject to challenge, which is exactly what Paul has done in his article.

Science has changed the world of sport beyond recognition, and for me overwhelmingly for the better. The problem we have with sport science in general is that we like it when it is linked to gold medals at the Olympics, league titles, world records and so on, and we start to get antsy about it when things just aren't going our way. One of the issues is that success breeds complacency: we become too trusting of the things that contributed to previous successes and overload the process with more of the same in anticipation that it will create further success. Well, that's science for you: adding more of the good stuff will not necessarily result in further improvement.


More of the so-called 'good stuff' may end up being too much of one or several ingredients, resulting in 'paralysis by analysis'. As in many business environments, sport science is becoming awash with untold volumes of data. The art is in the trickle, not the waterfall, of informative reporting to those for whom it is valuable. My main concern is not with the sport science itself: if every sport scientist, coach, medic and so on has ten or so KPIs, I would challenge whether all of these are really key. If the athlete is assessed against that many KPIs, the problem is not necessarily in the science; rather, there are too many KPIs and/or too much data to focus the team on the important, mutually beneficial KPIs that have a real impact on results. An athlete focusing on too many KPIs risks picking the ones that influence their personal decisions and actions rather than those that are mutually beneficial for the team.

To conclude, I would say blaming the science is too simplistic and gives no credit or recognition to the opposition, who in most cases will be applying the very same sport science that is supposedly the root cause of poor performance. Setting aside totally bad scientific practice, which I doubt exists at the elite level of sport, the important factor is evaluating the process when results are not going our way. The starting point is building a system that not only observes athlete behaviour and performance but also checks that our interventions in coaching and science are meeting their objectives, and whether the process has become overloaded, inefficient, flawed, or is having a negative influence on athletes. And finally, pushing the science to one side, heaven forbid, we may also have to consider that maybe our athletes are just not good enough to win.

REFERENCES:

Oxford Dictionaries. (2014). Science. Available: http://www.oxforddictionaries.com/definition/english/science. Last accessed 20/11/2014.

Paul Hayward. (2014). England need to escape analysis and structures and loosen up – sports science is strangling the life out of players. Available: http://www.telegraph.co.uk/sport/rugbyunion/international/england/11238149/England-need-to-escape-analysis-and-structures-and-loosen-up-sports-science-is-strangling-the-life-out-of-players.html. Last accessed 20/11/2014.

Bridging the PA Educational Gap.

In light of growing conversation and research around the value and perception of performance analysis, and its applied practice from elite level down to grass roots, I feel there is a need to understand the educational pathways, from a basic to a broad underpinning, that allow the discipline to be embedded in your coaching practice. Before sharing my personal views I should declare that I design and deliver a range of short courses in performance analysis and will draw on my experience of that environment, although I will seek to be as objective as possible.

To start, today I read the very interesting collaborative research summary from the Irish Institute of Sport (IIS), Sports Institute of Northern Ireland (SINI), Coaching Ireland and the Institute of Technology Blanchardstown (ITB). It is a must-read for anyone interested in the growth of PA as a discipline, and it further reinforced many of my thoughts. It also arrived at a convenient time to coincide with this blog. From an educational viewpoint the research highlighted some key points:
• 78% of coaches who do not use PA said they would like training on how to integrate PA effectively into their coaching.
• 86% of coaches using PA said they would like training on how to integrate PA effectively into their coaching.
• 94% of coaches stated that they would like to use more PA within their coaching.

 
Performance analysis was once a discipline reserved for elite sport and has been under-represented in coach education. While Cardiff Metropolitan University has been running an MSc in Performance Analysis at its Centre of Performance Analysis for over 10 years, and more recently introduced the first BSc in Performance Analysis, coaches and grass-roots practitioners have had little support in developing their skills and knowledge beyond software vendors delivering training heavily weighted towards the features of their specific products. I am not criticising software vendors or undermining the value of their training; it is an extremely important component of the success of their business and of the client receiving the support and training required to maximise the potential and power of the application.

 
We live in a world where the knowledge economy places great demands on many practices, and it is extremely important that we strive to continually improve our own skills and knowledge. We also need to consider that many coaches working at the grass-roots and development stages of sport have earned those positions through sacrifice and voluntary giving, out of a love of their particular sport. Furthermore, it is unlikely that a club at this level will have the budget to employ a performance analyst to support the coaching process.

 
While a degree is not a prerequisite and does not equate to automatic success and riches, if your ambition is to work at the elite level then higher education will remain in pole position for developing a broad underpinning and valuable research skills specific to the discipline, making you more employable. At the other end of the educational pathway, our short course in performance analysis is what I personally call a coach education programme. The classroom very often holds a blend of coaches, athletes, students and graduates, and the benefits flow in both directions. We offer an environment of facilitation in which to share practice, learn and explore, with a rich flow of ideas very often drawn from diverse sporting backgrounds and applied practices. Our participants not only get the opportunity to share practice but also expand their networks with like-minded people.

 
So I often ask: is this a discipline for the elite? Should we deprive those applying good practice through personal development of the opportunity to learn about performance analysis, so that it can be reserved for, or only applied by, those who learn through the portals of full-time, formal higher education? When considering a response to this question I think of my own employees and their personal development plans for the year. All my staff must be able to apply first aid and CPR, so do I send them off to become paramedics? Or do I evaluate the needs of the specific role and environment and send them on a course to gain the fundamental skills that fill the gaps?

 
As with first aid, it need not be either/or; we need both. The highly skilled professionals exposed to the intellectual rigour of university education will, as mentioned, hold pole position on the elite job front, yet the fundamental practice and theory of the discipline can be applied at a more basic level through an effective coach education course. This enables those at the grass-roots and development levels to acquire the practical and theoretical foundations to meet the demands of a modern coaching environment. Effective coach education and short courses can serve as an important bridge between theory and practice, and can also help many non-academic practitioners understand research. From a personal viewpoint I conclude by stating: there is value in short courses, just as there is significant value in higher education and the valuable research that underpins elite practice.
http://sini.co.uk/2014/03/new-research-highlights-the-growing-use-and-perception-of-performance-analysis-in-irish-sport/

 

I Don’t Create Stats Just Because I Can!

Today (05/03/2014) I engaged in a morning Twitter conversation with @SimonGleave and @CPAUWIC based around the question of small versus big data, and how we choose the right KPIs.

The question will have many different answers, and these will be significantly shaped by the environment and context in which data is being, or is to be, collected. While many grass-roots sports can function and run smoothly with no formal performance measurement tools, the further you progress, and in many cases the further you wish to progress, the more indispensable the process of performance analysis can become.

One of the key challenges for us all is deciding what to measure. For me the priority is to focus on quantifiable factors that are clearly linked to the drivers of success. Bear in mind that I do not see 'quantifiable' as synonymous with 'statistics'. While statistics of performance are among the most widely used measures, non-statistical measures can be just as important. For example, in football the positioning of units such as your midfield during transitions from attack to defence (and vice versa) may be the success factor in assuring adequate defensive cover or support in attack, yet it is never displayed as a 'sexy' stat. The quantifiable measure here is compliance with the desired game plan. So before we jump in and eagerly build our analysis process around collecting lots of statistical data, consider that the discipline of performance analysis combines data mining and research with strategy, an understanding of athlete behaviour, and partnering with the coaching, fitness and medical team to improve performance.
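To make the game-plan example concrete, here is a minimal sketch (not my actual workflow) of how a non-statistical factor such as midfield positioning during transitions can still be quantified: each transition is tagged live as either matching or not matching the agreed shape, and the output is simply compliance against the game plan rather than a raw count. The event fields and the 80% target are purely illustrative assumptions.

```python
# Minimal sketch: quantifying game-plan compliance from tagged transition events.
# The event structure and the 80% target below are illustrative, not a real workflow.

transitions = [
    # each dict is one defensive transition tagged live from video
    {"minute": 12, "midfield_in_shape": True},
    {"minute": 27, "midfield_in_shape": False},
    {"minute": 41, "midfield_in_shape": True},
    {"minute": 63, "midfield_in_shape": True},
]

def compliance_rate(events):
    """Share of transitions where the midfield unit matched the agreed shape."""
    if not events:
        return 0.0
    in_shape = sum(1 for e in events if e["midfield_in_shape"])
    return in_shape / len(events)

rate = compliance_rate(transitions)
print(f"Game-plan compliance: {rate:.0%}")
print("Meets target" if rate >= 0.80 else "Review transition shape in training")
```

The point is not the arithmetic, it is that the measure is defined against the desired game plan rather than against whatever the software happens to count by default.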

A recent blog by Keith Lyons quoted from an article by Geraint Lewis and Chris Powers:

“it is probably unsurprising that many fields are awash with poor, inefficient codes, and data-sets too extensive to be properly explored.”

We can so easily drown our focus in a maze of data and actually lose sight of the fact that it is sport, with many variables. While physics is by no means a strength of mine, I often think of the observer effect in quantum physics, which refers to the changes that the act of observation makes on the thing being observed. Does the fact that an athlete, coach or official knows they are being observed change their behaviour? Do stats influence athletes' decision making? Will they make more wrong decisions in an attempt to improve their stats?

So for me I don’t look to create numbers just because I can I look at many of the tactical, technical, physiological and psychological factors then seek to identify the interconnections between them relevant to our gameplan in search of the ‘game-changers’. Once these are established I ensure the data is shared with the people that have the knowledge to change behaviour i.e. the coach, S&C, Phsyio etc.

Internships, The Debate goes Mainstream but is Still Blunt!

Internships in sport, and in particular football, have been the focus of many Twitter and other social media discussions on and off over the past year. It is one debate that seems to go around in circles with no real solutions. Today a story emerged via The Independent about Reading FC and a 12-month unpaid internship. When I read through the job requirements my initial thought was that this is job displacement.

Within the Independent article, a Reading FC spokesman is quoted as saying: "Internships are an important part of career progression and experience building for any individual starting out on the path to their dream job."

This statement concerns me, given that the UK Sport website which advertised this post, and many more of these unpaid and paid internships, states the following job requirements:

“Applicants should:

  • Hold (or be in the process of completing) a postgraduate degree in performance analysis or Sports Science
  • Have a performance analysis background, with some previous experience of working in Professional/Semi-Pro Football
  • Be efficient in the use of both ProZone and SportsCode”

The Reading statement and the job specification clearly contradict each other. If the internship system is designed to provide valuable experience, then why would there be a requirement for 'previous experience of working in Professional/Semi-Pro Football'?

I want to make it clear that, while Reading FC seem to have become the focal point of an ethical debate, the issue is common practice within football. And for the purpose of clarity, I have personally advertised opportunities for unpaid work experience in the past, but have now taken the step to cap the period, pay expenses and/or pay for time worked.

An internship is meant to be a system of on-the-job training for graduates to gain highly valuable professional experience; this has been the defence put up by many over the past year. I am an employer, and regardless of the post, if on-the-job training in which a person requires full supervision took 12 months (full time), I would have serious reservations about ever employing anyone again. Once someone is working set hours, left unsupervised and working to deadlines, case law suggests that person is a worker and must be paid the minimum wage. At what stage do unpaid interns reach this status? Three, six, twelve months?

My personal opinion on a possible solution edges towards this: if there have to be unpaid internships, then they should last no longer than 12 weeks. This is ample time for interns to gain the highly valuable on-the-job training that will allow them to undertake unsupervised work and become a worker/employee. Organisations should not be allowed to run a cyclical process: an internship that expires with no employment outcome should not be allowed to be replaced by another intern within the following six months.

If someone has demonstrated the commitment to study for a degree and take on the debt that this often brings, do they not deserve a little recognition for the contribution they bring to the working environment?


Performance analysis, is it drowning in raw useless data?

At the elite level, sports performance data is an extremely valuable commodity and one that very often generates huge amounts of revenue for organisations such as OPTA and Prozone. It is also seen by many managers, coaches and athletes as extremely valuable in improving performance. As technology continues to advance at an alarming rate, there seems to be no end to the increasing ease with which we can acquire raw performance data. Having said that, there is a problem: raw data is useless. Furthermore, the vast volume of sport-related data is starting to mirror what the business world calls 'big data', and is therefore starting to require more and more sophisticated tools and exceptionally skilled personnel to mine it into meaningful information. The next issue will inevitably be whether these tools, and in particular the skilled people using them, have the knowledge to verify and contextualise the information so as to continue providing managers, coaches and athletes with the competitive edge it promises. I personally believe significant issues exist with information management in sport; I would even go as far as to suggest the world of elite sport is starting to go off course, in so much as the management of performance data may not be appreciated in the context of establishing a 'target audience'.


We as analysts collect data and are expected to create meaningful information for everyone exposed to it. A recent tweet by @CPAUWIC took me to an article on data visualisation and how we are often fooled by 'sexy' dashboards presenting data that may be of no real use. The fact remains that data cannot be understood until it is analysed; once the analysis process commences it starts to combine pieces of raw data that begin to tell a story, giving birth to new information. The visualisation and presentation of this information will have a major bearing on understanding it. Remember, it is this information that a manager, coach and athlete will hopefully understand in a way that helps them improve sporting performance. As already suggested, the volume of data being acquired is in some cases astonishing, and as a consequence it plants the seeds of too many questions. Two of mine would be: do we really understand the value of some of this data? And who is it valuable to? Reflecting on these questions, I tend to ask myself further questions: at what point will our athletes start playing the data rather than the sport itself? Are we already seeing the effects of athletes being more concerned about the data than the sport?

A word that consistently seems to crop up during pre-match and post-match interviews is 'balance' ("getting the balance right"). The same can be said of data: have we got the balance right? Are we providing athletes with too much data and information? I ask myself, if one does not have the knowledge to comprehend or react effectively to the data and information, then what is the value in providing it to that person? For example, if an athlete runs 5.9 miles during a match yet has no knowledge of conditioning, how does knowing this help them? If the measure of success were that the team that runs the most miles wins, then yes, there would be value in presenting this data in such a raw context; the fact remains, though, that most field-based sport is usually about scoring more points than your opponent. So how do we create value in such data so that athletes resist the urge to concentrate on improving specific elements of it during their performance, to the detriment of the overall performance?

It seems to be in the human psyche to want to know everything, which means data management, mining data and finding ways to make the data work for you is a growing challenge facing sport. Manchester City recently indicated, through the release of MCFC Analytics, that they feel the best way to get value from their data is to give it away. Are they drowning in their own data? Is this truly part of an open data policy? Or is it simply a way to see if anyone out there can create more meaningful information with the data than they currently can? When I first looked at this data I was amazed; in fact, there was so much data it was a little intimidating. Where would you even start to produce meaningful information? I then considered the fact that this was the small data file, and that too much information affects our ability to make decisions and move forward. I can only imagine how easy it is to be overwhelmed and paralysed by the vast volume of data.
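For anyone facing a file like that, a hedged first pass might look something like the sketch below: load the data, see what is actually in it, and summarise it before trying to build anything 'meaningful'. The file name and column names here are hypothetical placeholders, not the actual MCFC Analytics schema.

```python
# A cautious first pass at a large event-data file before deciding what matters.
# "mcfc_events.csv" and the column names are placeholders, not the real dataset.
import pandas as pd

events = pd.read_csv("mcfc_events.csv")

print(events.shape)    # how much data are we actually dealing with?
print(events.dtypes)   # what kinds of fields are present?
print(events.head())   # what does a single row of raw data look like?

# Summarise before analysing: counts per event type and per player show where the
# volume sits, and therefore what you can safely set aside on a first pass.
print(events["event_type"].value_counts().head(10))
print(events.groupby("player_id")["event_type"].count()
            .sort_values(ascending=False).head(10))
```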

Sport never has been, and never will be, a science. Athletes, coaches and managers were never meant to be scientists. Therefore, in my opinion, those who develop a balanced information management system, one that identifies the value of specific performance data and filters that data to the correct target audience, will be the ones that gain the most competitive advantage from performance analysis as it continues to evolve. When I consider some of the demands on the modern athlete, a quote by Idries Shah comes to mind: "People today are in danger of drowning in information; but, because they have been taught that information is useful, they are more willing to drown than they need be. If they could handle information, they would not have to drown at all."

As someone who dedicates the majority of my time to applying video-based performance analysis at the grass-roots and intermediate levels of sport, I see many benefits of introducing performance analysis into your coaching process. I constantly ask myself whether the grass-roots and development levels of sport can afford to ignore performance analysis altogether, and every time I conclude that they cannot. I think of the mass volume of data and information produced and circulated, and the knock-on effect it is having on the grass-roots and development levels of sport. Many people see all these 'sexy' dashboards, analysis blogs and media-printed performance reports and are simply afraid of introducing what seems to be a very complex process into their coaching. The pace at which the grass-roots and intermediate levels of sport take up analysis techniques will depend on how quickly we can convince the sceptics that there are benefits, and that you don't have to be an IT genius to introduce performance analysis into your coaching process.

The good news is that 'infobesity' can be dealt with, and doing so doesn't require the extreme measure of not embedding performance analysis into your coaching process at all. It just requires you to establish what's important and what's not. Start by creating a list of everything you want to know about your athletes' performance: outcomes of passes, tackles, shots, distances, runs, positioning and so on. Really go for it and make the most extensive list you can. Now for the important bit, the culling of data: define what matters to you and how important each data type is to your playing philosophy, and eliminate everything that comes low on your list of importance. Now start to capture some data, analyse it and monitor its value: are you getting a performance edge from acquiring it? If something isn't providing you with an edge or improvement, delete it from the process (a minimal sketch of this loop follows below). If you keep collecting it while getting no value from it you will still feel obligated to look at it, and that could affect your ability to make decisions. Only provide the information to those who can comprehend it and gain value from it; think of what your different specialist coaches need, your S&C coach, your medical staff and your athletes. Remember, less is more. I leave you with a quote from Herbert Simon: "What information consumes is rather obvious: it consumes the attention of its recipients. Hence, a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it."
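As promised above, here is a minimal sketch of that 'list, cull, capture, review' loop: keep a wish-list of metrics, score each against your playing philosophy, keep only the top handful, and drop anything that a later review says is not providing an edge. The metric names, importance scores and cut-off are assumptions used purely to illustrate the process.

```python
# Sketch of the 'list, cull, capture, review' loop described above.
# Metric names, importance scores and the cut-off are illustrative only.

wish_list = {
    # metric: importance to our playing philosophy (0-10, assigned with the coach)
    "pass_completion_final_third": 9,
    "transitions_in_shape":        9,
    "total_distance_covered":      7,
    "sprints_over_20kmh":          6,
    "crosses_attempted":           3,
}

# Cull: keep only the metrics that genuinely matter to how we want to play.
shortlist = {m: s for m, s in wish_list.items() if s >= 6}

# After a block of matches, review whether each metric is still earning its place;
# anything flagged as providing no edge is deleted from the process.
review = {"total_distance_covered": False}  # example: gave us no edge this block
shortlist = {m: s for m, s in shortlist.items() if review.get(m, True)}

print("Metrics we keep collecting:", sorted(shortlist))
```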

The start of an exciting journey as a performance analyst!

I met Clifford Adams in March this year (2012), when he enrolled on our pilot Level 3 award in sport performance analysis course in Belfast. I can remember our first session as if it was yesterday; there was a blend of experience from Ulster hockey, football, rugby and GAA backgrounds. While Clifford had an impressive football coaching background, he openly informed me and his fellow students of his limitations when it came to technology.

Throughout the course Clifford demonstrated the motivation and appetite to develop these skills to ensure he could produce analysis processes and workflows at his club. It was clearly difficult at times, but he seemed to enjoy every moment of the challenge. Clifford soon got a handle on the technology, and the course provided him with a good foundation in video analysis tools and in how to develop a process relevant to the level and age grades at which he intended to apply his new skills.

For the final assignment of the course Clifford pulled together an analysis feedback session in the format of a presentation to deliver to his peers. While it started a little shakily, it quickly progressed and ended up as what can only be described as 'punchy' and 'on point'. Needless to say, he passed the course with flying colours.

Since completing the course, I am glad to say it didn't stop there for Clifford. At the time of commencing the course Clifford was coaching the reserve team at Ballymena Utd in Northern Ireland, and on completing it he was offered the role of 1st team analyst. As I had witnessed during his time as a student with me, Clifford was excited by the challenge and took on the new role without hesitation.

I provided some interim support to Clifford to help him source suitable applications to capture the data he had agreed with Glenn Ferguson, the 1st team manager at Ballymena Utd. Clifford started to use Nacsport, a Windows-based timeline analysis application provided through AnalysisPro, along with the iPad apps TAGIT football, Coach's Eye and Ubersense, to establish workflows for competition and training.

I have since become good friends with Clifford, who recently joined me on a fact-finding mission to Wales when I set out to gain an insight into some of the analysis processes being applied at various levels of rugby there. Again, Clifford's appetite to learn and evolve his own practice was all too evident. Clifford is hungry for continued success and has not only established some impressive workflows for the 1st team at Ballymena but also built a team around him to capture data from the reserve and academy squads.

This season so far has proven to be an indifferent one for the Ballymena Utd 1st team: while they are progressing in terms of league position compared with historical performances, they have also suffered two heavy defeats. The difference this time around is that Clifford can now objectively analyse performance and provide the manager and players with accurate information that helps them understand some of the reasons why things happened as they did. This certainly seems to be having a positive effect. Personally, I am not a great believer in coincidence and more of a believer that things happen for a reason; the fact that Ballymena Utd won their first trophy in 23 years on Tuesday 27th November against Linfield FC is, in my opinion, the aggregation of a new progressive manager in Glenn Ferguson, the implementation of an effective analysis process and the buy-in from the players, and by no means a coincidence.

This story is evolving into a great case study for me, and I will pay close attention to future results and the progress of Clifford and all the squads at Ballymena Utd. When I reflect on the ups and downs of the season so far, the good results, two notable defeats of 6-0 and 8-0, and yes, the first trophy in 23 years, I think of a quote by Henry Ford that really sums up the team's progress this year: "Coming together is a beginning; keeping together is progress; working together is success."

I forgot to mention that Clifford (left) is a lifelong fan of Ballymena Utd. The picture is hopefully the start of the dream for Clifford; I have no doubt he can continue to apply his analytical skills to help the team win many more trophies.

The route to being a better performance analyst?

I spent the weekend facilitating a Level 3 sport performance analysis course that was in one sense challenging and in another extremely rewarding. The challenge lay in the diversity of experience within the group, which ranged from people with experience at the elite level to people with no experience whatsoever.

While I was faced with the task of fashioning an effective way of disseminating my knowledge, practices and ideas about the topic in a way that had value for everyone, the students themselves had to overcome the barrier of sharing their own knowledge and practice. It was not easy, but it was certainly rewarding, and as usual I personally learnt as much as I may have passed on to others. While the weekend was a success, it made me reflect on how I become a better performance analyst.

For many years there has been a barrier to entering the world of video-based analysis, in that much of the tooling and many of the resources have been premium priced, excluding many at the grass-roots and development levels of sport from accessing them. This barrier has at last been gradually eroded by some radical pricing structures from a leading timeline-based software supplier, combined with some free applications. These resources have opened up the market, ensuring we can produce exceptional workflows for everyone regardless of the level you compete at.

The software factor was crucial in moving forward; for me, the route to being a better performance analyst now belongs to the practitioner. We have at last wrestled the focus away from software suppliers and onto workflows, education and sharing good practice. Having said that, a challenge remains: are we as a sector open-minded enough to see beyond the risks of sharing our knowledge, skills and resources? Well, let us have a look at what has been happening.

Dave Roe created a network through LinkedIn called Performance Analysis in Sport that is rapidly heading towards 3,000 members. The network provides a foundation for familiarising yourself with others working in, or interested in, performance analysis in sport.

Josh Bryan created the Visual Performance Analysis website, intended to give practitioners and those interested in analysis a platform to share stories and convey a community voice for a growing industry. Josh creates lots of interesting content that is opening up the sharing of resources and workflows.

Most recently Keith Lyons pulled together a great group of educators to facilitate the open learning OAPS101 course that has, since its launch, brought together a vast and diverse community to share practice. The topics have to date provoked great discussion and importantly encouraged the sharing of practices.

My reflection led me to understand that the success of the weekend lay in the fascination I and others had in the different responses to the various tasks, discussions and questions posed throughout the two days. The experiences, opinions and ideas being presented were valuable to us all; regardless of age, experience, existing knowledge and skills, we all learnt something new, and very often learnt something old simply being applied in a different way.

Diagram shows the overlapping interdependencies of a community of practice.

I concluded that the route to being a better performance analyst is, in my opinion, shared practice. It will be difficult for many and at times painful; it will require openness and a preparedness to change. If we are successful, it will transform us into a real community of practice.

How do we get a performance edge from our analysis and data?

This is my first blog, and I hope to pose some very important questions for you, regardless of your experience and knowledge of performance analysis. Your comments and differences of opinion are welcome. So let's get started…

The world of analysis applications and tools is at last becoming a highly competitive market, driving down the cost of technology in sport and making it accessible to more and more people. Hughes and Franks stated that "computers and video taping tools allow for almost limitless storage, retrieval and analysis of data" (Hughes and Franks, 2008), which is clearly a positive in the context of coaching. Having said that, there is a significant drawback in the form of information overload, or 'infobesity' as it is becoming known. I am sure there are many elite-level sports today struggling with the large amounts of data they hold and asking themselves how to get a performance edge from their analysis and data.

An issue is emerging in that the large amounts of information available could prove to be more of a negative than a positive. I am currently part of an open learning community that has immersed me in the concept of augmented reality. I reflect on the augmented reality concept we are discussing, and while it has excited and fascinated me, I wonder where the edge may lie with this type of information in a live competition environment. I can see some obvious benefits in the development and improvement cycle, which you might say provide a real-time benefit in themselves, but I am still struggling with the actual in-game, real-time benefits outside of media visualisation. I am becoming concerned that we will spend vast amounts of money, resources and time merely manipulating performance data and lose sight of the ultimate goal: improving performance and ultimately attaining a real-time advantage during competition.

When I think of the vast amounts of data being recorded at the elite level and consider the concept of infobesity, I reflect on several conversations I have had and processes I have observed at the grass roots. Very often I come across people applying analysis at this level effectively, and on the other side of the coin I come across those who are struggling with it. Many people new to analysis have received coaching on the software packages they have been sold but have no proper analysis knowledge, and launch themselves into gathering vast amounts of data expecting to improve performance. Make no mistake about it: those who introduce performance analysis into their coaching are expecting quick and continuous performance improvements. They immediately feed vast amounts of diverse information to athletes over a short period of time and see no tangible benefits from the analysis, nor do they witness any performance improvements. Are we seeing the result of infobesity? And does the concept have the potential to start having a detrimental effect on performance? For me, the difference between those applying analysis effectively and those struggling with it seems to lie in the balance struck in filtering information to coaches and athletes.

So I ask: if we start to capture more and more real-time performance data, how do we process the different streams of data and bring them back together into a composite answer that tells us what we need to do to improve performance? And can these improvements really be made in real time? So my first blog has posed more questions than answers.

My advice to those new to analysis is, despite the temptation, you must resist the urge to hurl yourselves into gathering vast resources to capture performance data without first understanding how you can use it to improve performance.

For those with experience I leave you with a quote from Derek Lin, author of a very good read The Tao of Success: The Five Ancient Rings of Destiny.

“The blind pursuit of learning leads to excessive desires—the more you see, the more you want. Excessive desires, in turn, lead to anxiety and misery.”

References: Hughes, M. and Franks, I.M. (2008) The Essentials of Performance Analysis: An Introduction. Oxon: Routledge