
ISO 16710:2024
ISO 16710:2024 Ergonomics methods – Part 1: Feedback method – A method to understand how end users perform their work with machines
CDN $273.00
Description
This document describes the “Feedback Method”, a method designed specifically to collect the contribution of machinery end-users by reconstructing and understanding how work is actually performed (i.e. the real work). This method can help to improve technical standards, as well as the design, manufacturing, and use of machinery.
By collecting the experiences of skilled users, this method can be used to reconstruct their actual work activities under different operating conditions and with any kind of machine. This helps to identify all the critical aspects that have an impact on health and safety or that relate to ergonomic principles. It also makes it possible to identify basic elements for defining standards for machines and for their revision and improvement. In addition, it can improve production efficiency and reveal any need for additional study and research.
The method is designed to minimize the influence of the subjectivity of the facilitators and researchers in reconstructing and describing the reality of work, and to maximize the “objective” contribution of the skilled users of the machine.
The method combines a high level of reproducibility, sensitivity, and user-friendliness with low demands in terms of resources, which makes it attractive to micro, small and medium-sized enterprises.
This document is addressed to standards writers, designers and manufacturers, employers-buyers, end users, craftsmen and workers, and market surveillance bodies and authorities.
Edition
1
Published Date
2024-09-30
Status
PUBLISHED
Pages
31
Format 
Secure PDF
Secure PDF details
- Save your file locally or view it via a web viewer
- Viewing permissions are restricted exclusively to the purchaser
- Device limit: 3
- Printing: enabled for one (1) copy only
Related Documents
- ISO 17049:2013 Accessible design – Application of braille on signage, equipment and appliances – CDN $115.00
- ISO 24509:2019 Ergonomics – Accessible design – A method for estimating minimum legible font size for people at any age – CDN $312.00
- ISO 17097:2024 3-D human body scan data – Methods for the processing of human body scan data – CDN $173.00
- ISO 17069:2020 Accessible design – Consideration and assistive products for accessible meeting – CDN $173.00