
5S Routine Audit Form

Audit Date: ____________  Auditor(s): ____________
Area Audited: ____________  Area Rep(s): ____________

Scoring Legend: Green >=70%   Yellow 50%-69%   Red <=49%

Score by # of problems found: 0 problems = 5, 1 = 4, 2 = 3, 3-4 = 2, 5+ = 1.
If an item is not applicable to the area, score it N/A and do not include it in the final total.

SORT - Distinguish between what is needed and not needed
- Are unneeded equipment, tools, furniture, etc. present in the area?
- Are any Red Tagged items more than 3 weeks old?
- Are personal belongings properly stored?

SIMPLIFY - A place for everything and everything in its place
- Are aisle/walk ways and workstations clearly marked and identified?
- Are jigs, fixtures, tools, equipment, & inventory properly identified and in their correct locations?
- Are items put away after use?
- Are there max. and min. indicators for supplies?

SYSTEMATIC CLEANING - Cleaning and looking for ways to keep the workplace clean/organized
- Are cleaning materials easily accessible & properly stored?
- Are equipment and work stations kept clean and free of oil, grease and debris?
- Are designated walkways/stairs free of dirt, oil, grease and dust?
- Are lines, labels and signs clean and unbroken?

STANDARDIZE - Maintain and monitor the first three categories
- Are display boards used, organized, current and tidy?
- Are employees dressed appropriately and prepared?
- Have specific cleaning tasks been assigned?
- Are trash bins and scrap/recycle containers emptied on a regular basis?

SUSTAIN - Stick to the rules
- Is the 5S program discussed at Key Indicator/Crew Meetings?
- Are the tools in place to sustain the 5S program?
- Overall, is the area maintaining 5S rules and disciplines?

TOTAL % SCORE: ____ / ____ = ____%
Comments: Concerns / Plus Points / Suggestions

At first glance, the audit sheet may seem confusing, but with practice it becomes easier to use. One of our major concerns for the 5S initiative was consistency: there are strong auditors, weak auditors, and auditors with different focuses. To remain as objective and consistent as possible, we implemented a scoring system based on a tally of problem areas. In other words...

# of problems = 0    then Score = 5
# of problems = 1    then Score = 4
# of problems = 2    then Score = 3
# of problems = 3-4  then Score = 2
# of problems = 5+   then Score = 1
Category = N/A       then Score = N/A
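The mapping above is a simple step function. As a minimal sketch (the function name `item_score` is illustrative, not part of the form), it could be written as:

```python
def item_score(num_problems):
    """Map a tally of problems on one audit item to a score (5 best, 1 worst).

    Pass None for items that do not apply to the area (scored N/A);
    such items are excluded from the final total entirely.
    """
    if num_problems is None:   # category not applicable -> N/A
        return None
    if num_problems == 0:
        return 5
    if num_problems == 1:
        return 4
    if num_problems == 2:
        return 3
    if num_problems <= 4:      # 3-4 problems
        return 2
    return 1                   # 5 or more problems

# Example: two problems found on one item
print(item_score(2))  # → 3
```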

Using "Sort" as an example, there are 3 categories to review. If all 3 categories are relevant/applicable to the area, their score is out of 15 (i.e., a max possible score of 15/15 and a min score of 3/15). If the area does not have a red tag area (because its red tag items have already been dealt with and no more currently exist), the score is out of 10 (a max of 10/10 and a min of 2/10). If an item scores an N/A, it does not contribute to the overall score in any way. Each 'S' attains its own individual score so that an area can see which 'S' needs to be worked on. The final score is attained by (total points allotted [excluding N/A's]) / (sum of the denominators from each 'S').
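The per-'S' and overall arithmetic described above can be sketched as follows. This is an illustrative implementation, not taken from the audit sheet: item scores are assumed to already follow the 1-5 mapping, with None standing in for N/A.

```python
def audit_percentage(scores_by_s):
    """Compute per-'S' scores and the overall 5S percentage.

    scores_by_s maps each 'S' name to a list of item scores (1-5),
    with None for N/A items, which are excluded from both the
    numerator and the denominator.
    """
    per_s = {}
    total_points = 0
    total_possible = 0
    for s_name, scores in scores_by_s.items():
        applicable = [sc for sc in scores if sc is not None]
        points = sum(applicable)
        possible = 5 * len(applicable)   # each applicable item is out of 5
        per_s[s_name] = (points, possible)
        total_points += points
        total_possible += possible
    overall = 100 * total_points / total_possible if total_possible else None
    return per_s, overall

# Example: "Sort" with its red tag item scored N/A, so it is out of 10
per_s, overall = audit_percentage({"Sort": [5, None, 4]})
print(per_s["Sort"])  # → (9, 10)
print(overall)        # → 90.0
```

Because N/A items drop out of the denominator as well as the numerator, an area is never penalized for items that do not apply to it.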
