



Risk Psychometrics, Spin and Snake Oil

by Dr Rob Long · 5 comments

in Psychology of Safety and Risk, Robert Long, Zero Harm


Latest article by Dr Robert Long from www.humandymensions.com.au.

A great quote from the article:

So, let us name safety psychometrics for what it is. Safety psychometrics is workforce eugenics as if humans are not human, as if the absolute of zero is cause for absolute control of humans, as if risk is the enemy, as if fear is how we should live and as if learning and imagination are not required for the creation of safe workplaces. If one wants a human workforce, then robots won’t do.

ENJOY:

The recent boom in sales marketing by companies promising to eliminate risk takers and unsafe people from the workforce shows that there is good money to be made from spin and safety snake oil. The problem is not only that the claims made by these companies are founded on eugenic assumptions and pseudoscience, but that such ideas are dangerous, non-creative and anti-learning.

There are several tests to assess the validity and presence of snake oil. Try these:

1. One of the best ways to assess snake oil promotion is to consider the by-products of such promotion. Don’t be blindsided by wanting to believe the promises: what is hidden? What by-products do the promises conceal?

2. Another test of snake oil promotion is to see if the principles, assumptions and practices of such spin are applied to executives and advocates. If not, why not? It seems that snake oil is always good for ‘other people’.

3. Assess whether what is being proposed is simple and ‘too good to be true’. It is likely that it is. Humans are complex and any proposal that is simplistic is wishful thinking.

4. See if what is proposed applies any sense of absolutes or perfectionism to fallible human beings.

5. Also, consider how much certainty is guaranteed by what is proposed, the level of control it involves, and what it assumes about human nature, choice and freedom.

6. Think about what kind of organization would result from a roll-out of psychometrics. Who would then be running the organization? What level of determinism is embedded in such an organization?

7. How much of the attraction of the proposed snake oil is the wish for an easy solution to a complex problem? Is this why it is so attractive? How closely is the snake oil tailored to your problem, or is it simply an easy ‘off-the-shelf’ answer? How can ‘one size fit all’?

8. Another test of snake oil is trajectory: where is this going? What is proposed often sounds good in the short term but is in fact dangerous and destructive in the long term. So the first attribute of any safety professional needs to be the ability to be long-sighted.


9. The test of ethics should be applied to all snake oil, not just for the moment but for the long haul. What values are hidden in the small print? What beliefs and assumptions are hidden in the subtext? For example, is manipulation of human populations ethical, even though the proposed practices have a good intention?

10. The test of ideology should also be applied to the promises of snake oil marketing. Is what is proposed an ideology, that is, an all-controlling perspective that becomes an immovable system in itself, one that cannot be challenged? So much so that any challenge is cast as a binary opposite and opposition is understood to be an evil in itself?

When it comes to some of the recent marketing of psychometrics in risk, one has to wonder why there is such a market for this stuff. Have we reached such an end of the road in risk aversion that we now have to engineer (eugenics) the human workforce in our fear of risk and quest for zero? What is the by-product of trying to engineer risk out of a population? Why has such an honourable aspiration to reduce harm led to the manipulation of the workforce?

There is no learning without risk. The quest for risk aversion is the quest for non-learning. If we seek to take risk takers out of a workforce, we seek (as a by-product) to have a non-learning, risk-averse, robot-like workforce that can’t think, is blindly obedient and is unable to innovate, create and imagine. This is what psychometric engineering (eugenics) proposes. I have written about this previously (http://www.safetyrisk.com.au/safety-eugenics-and-the-engineering-of-risk-aversion/). Of course, without imagination, creativity and innovation, we make a less safe workplace. If one takes imagination out of the workforce, how can we hope to manage the uncertain (risk) and the unexpected?

When we look at the promises of recently marketed safety psychometrics, we must ask ourselves what kind of humans we want at work. We must also ask what attributes are taken out of the workforce in this paranoia and fear of risk.

Recent research by Davidson and Begley (The Emotional Life of Your Brain) shows that much of what is assumed in psychometrics is not matched by research in neuroscience. Whilst some aspects of psychometrics are helpful, it has never been accepted (by scholars of psychology) that psychometrics be used as some kind of engineering (eugenics) foundation for manipulation of a population.

So, let us name safety psychometrics for what it is. Safety psychometrics is workforce eugenics as if humans are not human, as if the absolute of zero is cause for absolute control of humans, as if risk is the enemy, as if fear is how we should live and as if learning and imagination are not required for the creation of safe workplaces. If one wants a human workforce, then robots won’t do.




Dr Rob Long

Social Psychologist, Principal & Trainer at Human Dymensions
PhD., MEd., MOH., BEd., BTh., Dip T., Dip Min., Cert IV TAA, MACE, MRMIA. Rob is the founder of Human Dymensions and has extensive experience, qualifications and expertise gained over 30 years across the government, education, corporate, industry and community sectors. Rob has worked at all levels of the education and training sector, including serving on various postgraduate executive, postgraduate supervision, and postgraduate course design and implementation programs.

  • Wynand

    I absolutely agree with Jim. Senior managers often just do not have the time to investigate safety processes and philosophies, so they rely on what safety managers sell them. I believe part of the problem is that the safety managers get so caught up in running their safety systems that they do not leave time to research and read up, making them easy targets for snake oil salesmen.

    I work in R&D, and our focus is mainly technical. The typical philosophy is “you either develop or buy”. You either buy what you do not want to develop, and focus on what you want to develop. You then develop what you cannot buy, or what you do not want to buy. Deciding what to develop, what to buy and what to ignore is the core of research management. (Only once this decision is made, real development takes place.)

    In this context, very few companies who are focused on development are prepared to develop their own safety systems and philosophies, since that would divert resources from money-making development. Your average company would therefore buy their safety systems (and safety philosophies). Enter the salesman. A company would buy what is presented to them as the best system, and “best” is often defined as “world practice” or “successfully implemented by company X”. The problem with that is that if industry is misled, then the unsuspecting follower will also be misled. Furthermore, what works for company X may not work for company Y, since (among other things) differences in culture may cause failure.

    I also believe that if a company made a mistake, they will seldom admit it, but will tend to “make it work”. That may be possible if the system is not inherently flawed, so systems like OHSAS 18001, for example, may well be worth the money and effort. On the other hand, combining it with “Zero Harm” may well create a system that is systematically sound (inspections, audits, checklists) but where the human factor needed to support the system is not on board. Result – good audits, but still not really safe.

  • http://schmuelsons.wordpress.com Schmuelsons

    Fear-inducing, blindly obedient, maintaining the status quo in safety, anti-learning, risk aversion, lack of human behaviour studies, and yes, the snake oil. I think it’s not science, just an effort to complicate a supposedly straightforward matter: safety.

  • Jim Loud

    Excellent article and all too true! Unfortunately, management often buys this expensive quick-fix snake oil because it is recommended to them by their safety “professionals.” We don’t treat other important organizational objectives (e.g., production, quality, etc.) with paint-by-the-numbers “solutions”. Why is safety treated so differently?

  • George Robotham

    Just goes to prove bullshit is alive and well in the safety world.
