[Learning Never Ends] Audit Materials Required for ISO-Certified Enterprises


Enterprises undergoing certification should make the following preparations in full before the certification audit, both to improve the pass rate and to earn the auditors' respect and recognition.

Part I. Establish basic data files

1. Prepare the enterprise's relevant certificates, including but not limited to: industry qualifications, business licenses, permits, CCC (3C) certificates, health approvals, safety permits, environmental acceptance approvals, safety evaluation approvals, etc.;

2. The following certificates must be valid: calibration certificates for measuring instruments, special equipment inspection certificates, and work permits for personnel with special requirements;

3. Check whether the actual number of personnel matches the number stated in the certification contract or audit notification. If there is a discrepancy, contact the certification body immediately; otherwise it may affect the audit and the validity of the certification certificate;

4. Be clear about the release dates of the company's management manual and procedure documents, and keep on hand the valid, latest versions of both. Supporting documents such as management systems, technical documents, and regulations should also be properly maintained. Manuals and procedures should be distributed to the relevant personnel (distributed copies may be electronic, but at least two complete paper sets should be available on site);

5. Enterprise managers should be familiar with the management policy and objectives, understand how the objectives are assessed and checked, and know how well they have been achieved; the division of responsibilities among department heads must be clear and consistent with the manual, and department heads should clearly know their own responsibilities. Organize and complete the enterprise's product inspection records (or construction acceptance records) and other relevant production or construction records. For products subject to mandatory national requirements, periodic product inspection reports issued by third-party agencies must be available; if special processes are defined in the management manual, process validation (confirmation) records must be available;

6. Know and understand the relevant information and processes of internal audits (internal reviews) and management reviews, including but not limited to:

(1) The general manager, management representative, and internal auditors must know the timing of internal audits and management reviews;

(2) Retain and be able to provide at any time: internal audit records, internal audit nonconformity reports, internal audit reports, management review reports, and reports from each department. Be able to clearly explain when management reviews are held, whether the improvements recorded in the management review report reflect reality (they must not simply repeat the previous review), and what work the management representative has done to promote the system;

Part II. Establish files for system operation in the following 20 areas (content can be flexibly added or removed according to the enterprise's actual situation)

1. Document control

A. Approval, distribution, and modification of internal documents:

1) Engineering drawings issued and used without approval;

2) Work instructions not distributed to specific work positions;

3) Work instructions hung at production sites are uncontrolled;

4) Process documents are directly changed on the document without following the document change procedure.

B. Identification, collection, and distribution of external documents:

1) Failure to fully identify and collect national/international and industry standards related to the product;

2) Failure to distribute external documents to relevant departments, such as quality control and production departments.

2. Completion, management, and retention of quality records

1) Quality records show signs of alteration;

2) Quality records do not specify retention periods;

3) Records are not kept according to retention periods, and destruction records cannot be provided upon expiration.

3. Statistics and analysis of quality objectives

1) Quality objective statistics cannot provide the original data, making it impossible to verify the authenticity of the final statistical values (a minimal way of keeping raw data alongside the computed value is sketched after this list);

2) Quality objectives are statistically recorded but not analyzed.
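To illustrate point 1) above, the Python sketch below (the field names, dates, and the 95% target are all hypothetical) shows one way to keep raw delivery records next to the computed objective value, so the reported on-time delivery rate can be traced back to its original data rather than existing only as a final percentage.

```python
# Minimal sketch; hypothetical record fields and target value.
# Keeping the raw records beside the computed figure lets an auditor verify
# the reported objective value instead of seeing only the final percentage.
from dataclasses import dataclass
from datetime import date

@dataclass
class DeliveryRecord:
    order_no: str
    promised: date
    delivered: date

records = [
    DeliveryRecord("PO-001", date(2024, 3, 1), date(2024, 2, 28)),
    DeliveryRecord("PO-002", date(2024, 3, 5), date(2024, 3, 7)),
    DeliveryRecord("PO-003", date(2024, 3, 9), date(2024, 3, 9)),
]

on_time = sum(r.delivered <= r.promised for r in records)
rate = on_time / len(records) * 100          # computed objective value
target = 95.0                                # hypothetical quality objective

print(f"On-time delivery rate: {rate:.1f}% (target {target}%)")
for r in records:                            # raw data retained for verification
    status = "on time" if r.delivered <= r.promised else "late"
    print(r.order_no, r.promised, r.delivered, status)
```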

4. Management review

1) Management review input information is incomplete or input materials cannot be provided;

2) The management review is not chaired by top management, and no proof of authorization from top management for the chairperson (if not top management themselves) can be provided;

3) No evidence of measures taken for management review resolution items, such as corrective or preventive actions;

4) No records of follow-up results for the last management review resolution items.

5. Human resource management

1) Failure to specify responsibilities, authorities, and competency requirements for each position according to actual posts;

2) Training is planned and carried out as planned, but its effectiveness is not evaluated;

3) No competency requirements specified for special positions, and no evidence of training and assessment for these personnel;

4) The annual qualification review requirements for special operation personnel (electricians, welders/cutters, crane operators, etc.) are not tracked in a timely manner, and some of these personnel's qualification certificates have not been reviewed annually or have expired.

6. Infrastructure management

1) New production equipment is put into use without acceptance;

2) No maintenance and upkeep requirements specified for equipment;

3) No evidence of regular inspection for special equipment.

7. Work environment management

1) For sites with temperature and humidity requirements, no thermometer or hygrometer is available, making it impossible to monitor temperature and humidity conditions;

2) Lighting at color difference inspection posts is not a dedicated inspection light source and does not meet requirements;

3) Production and storage sites have dust prevention requirements, but dust is found on products stored on site.

8. Product realization planning

1) Quality objectives are not formulated according to product categories or characteristics;

2) Although product realization planning is conducted, the materials are scattered and disorganized, and the responsible persons are not familiar with (or are even unaware of) the requirements for product realization planning;

3) Modifications to related documents caused by engineering changes were not executed according to the approval procedures, with unauthorized changes occurring; some related documents were partially modified while others were not, resulting in incomplete modifications;

4) Quality control points of the product were not planned, and the timing for verification, validation, monitoring, measurement, inspection, and testing was not determined.

9. Processes related to customers

1) Legal and regulatory requirements related to the product (including national/international, industry standards, specifications, etc.) were not identified or insufficiently recognized;

2) Post-delivery activities of the product (including measures stipulated in warranty clauses, contractual obligations [e.g., maintenance services]), as well as additional requirements deemed necessary by the company (e.g., recall or final disposal), were unclear;

3) When customers did not provide documented requirements, there was no evidence of confirmation of these customer requirements; verbal contracts were not reviewed;

4) The enterprise failed to plan contract review operational requirements based on its own business process characteristics, resulting in a formalistic approach without practical significance;

5) When product requirements changed, the changed requirements were not promptly communicated to relevant personnel;

6) Customer feedback (including customer complaints) was handled, but the results of the handling were not communicated to the customer.

10. Design and Development

1) Common issues during design and development planning:

a) Responsibilities and authorities of design team members were not clearly defined;

b) Requirements for design and development progress were not clearly defined, and the design and development plan was not adjusted promptly as progress changed;

c) Timing for review, verification, and validation activities was not planned during planning.

2) Design and development input information was insufficient, such as inadequate identification of applicable legal and regulatory requirements for the product;

3) Approval of design and development outputs before release was incomplete, for example, drawings only had the name of the preparer, but no signatures from reviewers or approvers;

4) Records of design and development reviews, verifications, and validations were incomplete and not conducted as planned; improvements proposed during these processes were not recorded;

5) After design and development changes occurred, appropriate review, verification, and validation were not conducted as required;

6) Changes in design and development caused changes in related documents, but those documents were not revised promptly, nor were the change requirements promptly communicated to relevant personnel.

11. Procurement Process Control

1) The type and extent of supplier and purchased product control were not determined based on the impact of the purchased product on the final product;

2) Supplier selection and evaluation did not cover all material suppliers and outsourcing parties, especially the evaluation of outsourcing parties (a simple weighted-scoring sketch follows this list);

3) Relevant certification documents provided by suppliers (such as quality assurance certificates, material inspection reports, qualification certificates, etc.) were not updated in time to ensure their validity;

4) Requirements for purchased products were not communicated to suppliers promptly or completely, resulting in suppliers failing to supply according to requirements;

5) Quality issues from suppliers were fed back to them, but the effectiveness of the suppliers' corrective actions was not verified promptly;

6) Verification requirements (methods, timing) for purchased products were not clearly defined, resulting in products entering storage without verification.
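As a sketch only (the criteria, weights, and threshold below are assumptions, not a prescribed method), the snippet shows a simple weighted supplier evaluation that treats material suppliers and outsourcing parties alike and factors in whether their certification documents are still valid.

```python
# Minimal sketch; hypothetical criteria, weights, and qualification threshold.
# A weighted score applied to every material supplier and outsourcing party,
# including a "documents" criterion for the validity of their certificates.
suppliers = {
    "Supplier A":   {"quality": 92, "delivery": 88, "documents": 100},
    "Outsourcer B": {"quality": 85, "delivery": 95, "documents": 60},  # expired certificates
}
weights = {"quality": 0.5, "delivery": 0.3, "documents": 0.2}

for name, scores in suppliers.items():
    total = sum(weights[criterion] * scores[criterion] for criterion in weights)
    grade = "qualified" if total >= 80 else "needs improvement"
    print(f"{name}: {total:.1f} -> {grade}")
```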

12. Control of Production and Service Provision Processes

1) Work instructions required at production and service sites were not issued/posted promptly, and the work instructions used on site were not updated to match the products actually being produced;

2) Faulty equipment was not marked with its status;

3) Inspection instruments and monitoring equipment used on site lacked calibration/verification status identification;

4) Evidence of monitoring process parameters during production was not provided;

5) Operators of special processes were not certified or were on duty without training;

6) Special processes were not validated, and after production conditions changed, special processes were not revalidated;

7) Product status (inspection status, processing status) markings during production were incomplete;

8) Information such as product batch, order number, and production date was incomplete;

9) Product protection was lacking, such as damage to products at the bottom due to excessive stacking height, damaged product packaging, etc.;

10) Customer property was not clearly identified, and abnormalities were not reported to the customer promptly.

13. Control of Monitoring and Measuring Equipment

1) Identification of equipment included in the scope of monitoring and measuring equipment control was incomplete, such as pressure gauges/temperature controllers in machines, ammeters/speed controllers and temperature controllers of welding machines, speedometers of conveyors, and other monitoring equipment not included in the control scope;

2) No plan was formed for calibration/verification of monitoring and measuring equipment, and it was not determined whether internal or external calibration was used;

3) Internal calibration lacked calibration/verification specifications and could not be traced to national or international standards;

4) Internal calibrators did not receive professional training and had no internal calibrator qualification certificates;

5) Monitoring and measuring equipment lacked status identification, making it impossible to determine whether it was within the calibration/verification validity period (a small register-screening sketch follows this list);

6) Protection of precision instruments was insufficient, such as measures against vibration and dust.
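For illustration only (the instrument names, dates, and intervals are hypothetical), the sketch below screens a calibration register and flags any instrument whose calibration/verification validity period has lapsed, which is the kind of status tracking point 5) refers to.

```python
# Minimal sketch; hypothetical register entries and calibration intervals.
# Flags instruments whose calibration/verification validity period has expired.
from datetime import date, timedelta

register = [
    {"id": "PG-01", "name": "pressure gauge",         "calibrated": date(2024, 1, 10), "interval_days": 365},
    {"id": "TC-07", "name": "temperature controller", "calibrated": date(2023, 2, 1),  "interval_days": 365},
]

today = date.today()
for item in register:
    due = item["calibrated"] + timedelta(days=item["interval_days"])
    status = "VALID" if today <= due else "OVERDUE"
    print(f'{item["id"]} {item["name"]}: calibration due {due} -> {status}')
```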

14. Customer Satisfaction

1) The methods for monitoring and measuring customer satisfaction were too limited, relying only on customer satisfaction surveys and ignoring customer complaints, returns, customer follow-up visits, customers' evaluation reports of their suppliers, and other information (a sketch combining several channels into one indicator follows this list);

2) The scope of the customer satisfaction survey is not representative; only key customers were surveyed;

3) Customer satisfaction surveys exist, but no evidence is provided on how this information is used, e.g., how work is improved as a result.
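The sketch below is only one illustrative way (the weights, scaling, and input figures are all assumptions) to combine several information channels, such as survey scores, complaint rates, and return rates, into a single satisfaction indicator instead of relying on the survey alone.

```python
# Minimal sketch; hypothetical weights, scaling, and input figures.
# Combines survey results, complaints, and returns into one indicator.
survey_score = 88.0          # average survey result, 0-100
complaint_rate = 0.8         # complaints per 100 orders
return_rate = 1.5            # returned orders per 100 orders

# Convert complaint/return rates into 0-100 scores (illustrative scaling only).
complaint_score = max(0.0, 100.0 - complaint_rate * 10)
return_score = max(0.0, 100.0 - return_rate * 10)

weights = {"survey": 0.5, "complaints": 0.3, "returns": 0.2}   # hypothetical weights
composite = (weights["survey"] * survey_score
             + weights["complaints"] * complaint_score
             + weights["returns"] * return_score)
print(f"Composite satisfaction indicator: {composite:.1f}/100")
```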

15. Internal Audit

1) The audit scope is stated in the internal audit plan, but the checklist does not fully cover it; in particular, clauses explicitly listed in the plan to be audited are not reflected in the checklist and records (a simple coverage check is sketched after this list);

2) The arrangement of auditors is unreasonable, failing to consider the auditors' professional competence;

3) The time arrangement in the audit schedule is unreasonable, not considering the complexity and scope of responsibilities of the audited department;

4) Top management did not attend the opening and closing meetings;

5) The nonconformity reports issued by internal audit do not clearly describe the facts of nonconformity, lack traceability, and fail to clearly describe the specific details of the nonconformity;

6) Insufficient rectification of nonconformities: inadequate root cause analysis, unreasonable corrective actions;

7) Follow-up verification of nonconformities was not arranged in a timely manner, and the verification result reports are unclear.
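A minimal sketch of the coverage check mentioned in point 1), assuming hypothetical clause lists: it simply compares the clauses named in the audit plan with those appearing in the checklist and reports any gaps.

```python
# Minimal sketch; hypothetical clause numbers.
# Verifies that every clause named in the internal audit plan appears in the checklist.
planned_clauses = {"7.1.5", "8.2", "8.5.1", "9.2", "9.3"}
checklist_clauses = {"8.2", "8.5.1", "9.2"}

missing = planned_clauses - checklist_clauses
if missing:
    print("Clauses in the plan but not in the checklist:", sorted(missing))
else:
    print("Checklist covers all planned clauses.")
```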

16. Process Monitoring and Measurement

1) Production processes are monitored, but the monitoring data are not sufficiently analyzed, so the capability of the production process is not evaluated;

2) There is no planning for monitoring the operation of the system processes and no such monitoring was conducted; only internal audit evidence is available;

3) Statistics on process performance indicators are insufficient, so process capability is not understood (a Cp/Cpk calculation is sketched after this list).
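One common way to express process capability is the Cp/Cpk index. The sketch below (the specification limits and sample values are hypothetical) shows the calculation; it is an illustration, not a substitute for a proper capability study.

```python
# Minimal sketch; hypothetical specification limits and sample data.
# Cp  = (USL - LSL) / (6 * sigma)
# Cpk = min((USL - mean) / (3 * sigma), (mean - LSL) / (3 * sigma))
import statistics

samples = [10.02, 9.98, 10.01, 10.03, 9.97, 10.00, 10.02, 9.99]  # measured values
usl, lsl = 10.10, 9.90                                           # hypothetical spec limits

mean = statistics.mean(samples)
sigma = statistics.stdev(samples)   # sample standard deviation

cp = (usl - lsl) / (6 * sigma)
cpk = min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))
print(f"mean={mean:.4f} sigma={sigma:.4f} Cp={cp:.2f} Cpk={cpk:.2f}")
```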

17. Product Monitoring and Measurement

1) Inspection positions have not received inspection/test operation instructions;

2) Inspectors lack sufficient competence in, and understanding of, the use of AQL sampling (a simple accept/reject decision sketch follows this list);

3) Inspection reports lack sufficient test data; items requiring specific values are left blank;

4) Inspection and testing are not conducted 100% according to the specified items in the inspection/test specifications/standards;

5) Emergency release (or exception release) cases lack evidence of approval by authorized personnel and have insufficient traceability markings;

6) Inspection reports lack signatures from authorized release personnel.
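To make the AQL point concrete, the sketch below shows the accept/reject logic of a single-sampling plan. The sample size and Ac/Re numbers are placeholders only; the real values must be read from the current ISO 2859-1 (GB/T 2828.1) tables for the lot size, inspection level, and AQL in use.

```python
# Minimal sketch; placeholder sample size and Ac/Re numbers.
# Single-sampling accept/reject decision under an AQL plan.
def lot_decision(defects_found: int, ac: int, rej: int) -> str:
    # ac = accept number, rej = reject number, taken from the sampling tables
    if defects_found <= ac:
        return "ACCEPT"
    if defects_found >= rej:
        return "REJECT"
    return "UNDEFINED"   # Ac and Re are normally consecutive, so this should not occur

sample_size = 125        # placeholder; take from the sampling tables
ac, rej = 3, 4           # placeholder accept/reject numbers
print(lot_decision(defects_found=2, ac=ac, rej=rej))   # -> ACCEPT
```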

18. Nonconforming Product Control

1) On the production site, nonconforming products generated during the production process are not clearly identified and not recorded in a timely manner;

2) Nonconforming incoming goods are disposed of, but suppliers are not required to take improvement measures; some nonconformity reports are sent to suppliers but their effectiveness is not tracked and verified in time;

3) Nonconforming products reworked or repaired during production are not re-verified; some are verified but records of re-verification after rework/repair are not provided;

4) There are occurrences of rework and repair in the process, but no records are kept of the rework and repair process;

5) There are special approvals (concession acceptance) for materials during production, but no evidence of approval by authorized personnel is provided;

6) Products returned by customers are directly returned to the warehouse without re-inspection or execution of nonconforming product procedures.

19. Data Analysis

1) Customer satisfaction surveys and statistics are conducted, but no evidence of analysis is provided;

2) The quality control department has statistics on pass and fail rates, but no evidence of analysis of nonconformity conditions is provided;

3) There is insufficient understanding of the requirement for data analysis of process performance; only statistics and analysis of production process performance such as rework rate, repair rate, and scrap rate are available, but there is a lack of data analysis evidence for process performance in other departments; (this can be combined with statistical analysis of quality objectives at various functional departments and levels.)

4) Incoming goods have statistical analysis of incoming quality rate and on-time rate, but no analysis of individual suppliers' supply capability;

5) Statistical analysis of quality objectives only analyzes items that did not meet target requirements, lacking data analysis for those achieved, and fails to seek opportunities for preventive measures;

6) The range of statistical methods and techniques used is narrow; the methods are too simplistic and lack rigor (a Pareto analysis sketch follows this list).
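As one example of a basic statistical technique, the sketch below performs a Pareto analysis of nonconformity categories (the categories and counts are hypothetical), showing how cumulative percentages reveal the few causes responsible for most of the problems.

```python
# Minimal sketch; hypothetical nonconformity categories and counts.
# Pareto analysis: sort categories by frequency and accumulate percentages.
from collections import Counter

defects = Counter({"solder bridge": 42, "missing part": 18, "scratch": 9, "label error": 6})
total = sum(defects.values())

cumulative = 0
for category, count in defects.most_common():
    cumulative += count
    print(f"{category:<14} {count:>3}  {count/total:6.1%}  cumulative {cumulative/total:6.1%}")
```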

20. Improvement

1) There is essentially no record showing that preventive measures have been implemented, and it is unclear when preventive measures should be taken;

2) Regulations on when to take corrective and preventive measures are unclear and arbitrary;

3) Cause analysis in improvement reports is inadequate and superficial, lacking comprehensive and in-depth analysis; (analysis should consider the 5M1E factors: Man, Machine, Material, Method, Measurement, and Environment, and use the 5 Whys method)

4) When formulating corrective measures, only emergency corrective actions are considered, lacking measures to prevent recurrence;

5) Confusion exists between the concepts of correction, corrective action, and preventive action; corrective action reports mix corrective and preventive measures together (a record template separating the three is sketched after this list);

6) Corrective/preventive measures are implemented but results of implementation are not recorded;

7) After completion of corrective/preventive measures, there is a lack of verification of their effectiveness.
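To illustrate the distinction raised in point 5), the sketch below (field names and content are hypothetical) records correction, corrective action, and preventive action as separate entries, together with root-cause analysis and a follow-up effectiveness check.

```python
# Minimal sketch; hypothetical field names and content.
# A corrective/preventive action record that keeps correction, corrective action,
# and preventive action separate, with root cause and an effectiveness check.
capa_record = {
    "nonconformity": "Batch 2403 leak-test failures",
    "correction": "Rework and retest the affected batch",            # fixes the immediate problem
    "root_cause": "Seal press pressure drifted; setup step not covered by 5M1E review",
    "corrective_action": "Add a pressure check to the setup sheet; retrain operators",
    "preventive_action": "Apply the same setup check to similar press lines",
    "effectiveness_check": {"due": "2024-06-30", "result": None},    # filled in after verification
}
print(capa_record["correction"])
```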

END

Statement: The videos, images, and text used in this article are partly sourced from the internet, and copyrights belong to the original authors. If there are copyright issues, please contact us promptly for verification and negotiation or removal.

About Beijing United Intelligence Certification Co., Ltd.

Beijing United Intelligence Certification Co., Ltd. is an important member of the United Intelligence Productivity Group, a well-known international, comprehensive high-tech service organization. It provides technical services in standardization, green and low-carbon development, ecological environment, emergency safety, quality management, informatization, and other fields to customers in nearly all industries worldwide, offering deep intellectual support for the sustainable development of enterprises and government organizations.
The company has obtained dual accreditation from the China National Accreditation Service for Conformity Assessment (CNAS) and the United Kingdom Accreditation Service (UKAS) in its main certification fields. It is a credit AAA-level enterprise and has been honored as a Model of Integrity in Beijing.
United Intelligence Industry has more than 1,000 full-time and part-time technical staff, with branches in more than twenty major cities across China and offices in several countries and regions abroad. It serves over 50,000 client organizations, has issued more than 100,000 certificates in total, and its business performance has ranked among the top of the domestic industry for many consecutive years. Its efficient, high-quality services have earned wide praise from clients and stakeholders.
 
