Lean Six Sigma Quality Transformation Toolkit (LSSQTT)*
LSSQTT Tool #11 Courseware Content
“Basic Measurement, Geometric Relationships, Broader Data-based Issues”

1. Foundational metrology and measurement issues
2. Accurate data, total quality systems, kaizen, lean, six sigma
3. Metrology and inspection system services in quality
4. Historical background on metrology
5. Form, fit, finish and function, geometric underpinnings
6. Foundational metrological and measurement issues
7. Basic measurable features in geometric dimensioning
8. Basic principles and devices for measurement and data collection
9. Shifting toward the metric system
10. Surface quality: focused foundational metrological issue

*Updated fall, 2007 by John W. Sinn.

Foundational Metrology And Measurement Issues

Basic metrological principles and tools are important in the context of quality systems and data gathering. Important principles include basic measures and standards, precision and accuracy, and calibration. Several metrological tools are necessary to help implement these principles. These include gage blocks, surface plates, and other gages and direct measurement devices. Each of these will be briefly presented and discussed in this section. As well, selected other specific measuring information and devices will be pursued elsewhere in this tool, focused directly on metrology issues and circumstances. The foundational metrological and measurement issues are shown graphically nearby.

Additionally, several basic measures are useful in industry to help give meaning, through standard values, for inspection. These include length, time, mass, temperature, electric current, light, and others. The levels of standards and values which are commonly used revolve around working levels, calibration, functional, reference and national/international standards. Precision and accuracy are functional interpretations of the standards in production. Precision is the closeness with which a measurement can be read directly from a measuring instrument; it is also considered as the smallest marked increment on the instrument. By contrast, accuracy is the measure of how close the reading is to the true size of the part. Calibration is the comparison of a measurement, standard or instrument of known/dependable accuracy with another standard or instrument to detect, correlate, report or eliminate by adjustment any variation in the accuracy of the device being compared. Calibration systems generally involve procedures to enable a comparison of the instrument being calibrated with a standard having a higher degree of accuracy.

Hand held measurement devices include rules, calipers, micrometers, dividers, small hole gages, and telescoping gages. Most of these are also direct measurement devices since they are read directly on the instrument. Indirect gages require comparison to a standard to get an actual final measure. Related to this, bench type inspection devices include the vernier height gage and dial indicators. Likewise, the use of light to view a product or component shape relative to the desired shape as noted on a template is what occurs with an optical comparator. This involves staging a shape on the comparator and projecting it for comparison purposes onto the screen, relative to known values. More recently, surface plates and other traditional metrological tools are being used less and less in favor of coordinate measuring machines (CMM).
The CMM is a device, based largely on traditional surface plate logic, having built in overhead measurement capability with the traditional values of surface plate technology as the basis. By using overhead measurement systems, tremendous precision, often lost in traditional hand setups, is gained. Through the use of high quality machine tool precision being built in, CMM's also take full advantage of computer numerical control (CNC) logic, further explained in later tools. It is important to recognize that high quality, precision, direct computer controlled measurements for the automated workplace will often be achieved through the use of the CMM. Even where automation is not the question, CMM technology may be the answer. The CMM can be programmed for various parts, and assuming an organization must inspect a part on a repeat basis, the program can be called up and recycled for significant gains in productivity--realizing there was a cost involved in writing the program. The direct computer control (DCC) feature of the CMM also enables reverse engineering, taking dimensions directly from the component being inspected. The reverse engineering system will also become increasingly essential in the emerging concurrent engineering environment of the future. This also relates to the need for evaluating our measurement systems to determine measurement error and other inconsistencies. The following procedures are intended to assist in the analysis of gage repeatability and reproducibility (R & R), sometimes called measurement analysis. This is particularly appropriate and helpful for analyzing shop floor (and other) gauges and instruments, and their operators, for usefulness in the broader inspection system. It is also true, however, that these R & R values will increasingly be required for documentation as suppliers and vendors--to be shipped with product and/or provided in other ways to demonstrate general capability in the inspection and quality system. Yet the fundamental reason for the R & R analysis is to assure as best we can that our gages are in fact accurate, and that our operators are using them correctly. Gage analysis would be a logical area of pursuit if we are seeing unexplainable variation in our X bar and R charts, or if we have eliminated virtually all other possibilities--the gages simply should be evaluated on a regular basis. This includes relational connections to maintenance, indicating that when the gages do not check out as repeatable it is likely they are needing repair or calibration, or both. Although eventually data collection may occur semi-automatically via computer terminals at workstations, currently most data will be manually measured at the gage and manually recorded on forms. This establishes the basis for the system, in conjunction with the gage and procedure for use-manually collected data. It is important that all operators and others understand the importance of taking good measures and recording this data accurately. Much analysis time will be spent--and decisions made on the basis of this data, and thus, it is vitally important that it be done carefully. Similarly, the forms are only as good as the people make them. Through use, and based on team inputs, the forms should continue to evolve toward a satisfactory collection, organization and analysis tool. Part of the point is, as we use the forms, as a key part of the system, we should be jotting down notes, talking to our supervisor about them, observing how others use them, and so on. 
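To make the gage repeatability and reproducibility (R & R) analysis described above more concrete, the short Python sketch below works through the common average-and-range calculation on a small, made-up data set. Everything in it is illustrative rather than prescriptive: the readings, the two-part/three-trial/three-operator layout (a real study would normally use on the order of ten parts), the assumed part tolerance, and the K1/K2 constants, which are the values commonly published for three trials and three operators and should be checked against the organization's own measurement systems analysis reference.

# Gage R&R sketch, average-and-range method. All numbers are hypothetical.
# readings[operator][part] = list of trial measurements, inches.
readings = {
    "op_a": {"p1": [0.651, 0.650, 0.651], "p2": [0.659, 0.660, 0.658]},
    "op_b": {"p1": [0.651, 0.652, 0.651], "p2": [0.659, 0.661, 0.660]},
    "op_c": {"p1": [0.650, 0.651, 0.650], "p2": [0.658, 0.659, 0.658]},
}
K1, K2 = 0.5908, 0.5231      # commonly published constants for 3 trials / 3 operators
n_parts, n_trials = 2, 3
tolerance = 0.020            # assumed total part tolerance, inches

# Repeatability (equipment variation): based on the average within-operator range.
ranges = [max(t) - min(t) for parts in readings.values() for t in parts.values()]
r_double_bar = sum(ranges) / len(ranges)
ev = r_double_bar * K1

# Reproducibility (operator variation): based on the spread of operator averages.
op_means = [sum(sum(t) for t in parts.values()) / (n_parts * n_trials)
            for parts in readings.values()]
x_diff = max(op_means) - min(op_means)
av_sq = (x_diff * K2) ** 2 - (ev ** 2) / (n_parts * n_trials)
av = av_sq ** 0.5 if av_sq > 0 else 0.0

grr = (ev ** 2 + av ** 2) ** 0.5
print(f"EV = {ev:.5f}  AV = {av:.5f}  GRR = {grr:.5f} (inches)")
print(f"GRR as a percentage of tolerance (6-sigma spread): {600 * grr / tolerance:.1f}%")

Commonly cited guidelines treat results under about 10% of tolerance as acceptable and results over about 30% as needing attention, but the governing criterion should come from the customer or from the organization's own quality system documentation; a large percentage points back to the maintenance, calibration and training questions raised above.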
Perhaps most important, we should be listening to what our customers are saying about the forms as well as the rest of the system. This includes people up and down the line, in quality functions, engineering personnel, other team members, and so on--all internal customers. This also, obviously, includes our external customers. Accurate Data, Total Quality Systems, Kaizen, Lean, Six Sigma Before proceeding any further, it would be well to begin/continue emphasizing the need to gather and handle data carefully and accurately. During collection and manipulation, such as organizing into columns or in forms, it is vitally important that we be mindful of the need to use extreme care and precision in our work. As we will see, increasingly, the data will formulate the basis of virtually all decisions in the workplace and throughout the organization. If we do not exercise extreme care and caution when measuring and recording, and perhaps as we calculate, we may be helping to put ourselves, and others, out of work. The sample we use as the basis upon which to draw all of our conclusions must be part of that carefully accomplished thinking for deriving accurate data about characteristics. How frequently to measure, when and where, how to measure, size of subgroup, who should do it, and what to do with the data, are all the beginnings of the right questions to ask about the quality system. This all must be done through careful interaction and team work with persons in quality, engineering, and throughout manufacturing--getting accurate sampled data is not easy--but it is vitally important since so much relies on this foundational data. All of this must be carefully thought through and designed with strong and careful communication from and with customers and suppliers, internal and external. Virtually all else that we will do, and certainly all other tools to be studied, are based on and around the basic statistical information given here at this time--with some expansion and elaboration. But it is very important that we understand what is happening here now--and beyond. It is also vitally important that we begin to take steps to integrate this into all that we do and all that we are about--from the point of the raw material being ordered to the finished product being shipped. Just as the basic SPC forms the basis for our technical future, the attitudes that we must build and engender for all of our people are vitally important--all of the tools in the world cannot make any difference if I do not want to learn them or use them. With statistical feedback provided, standards can either be altered or created. Clearly, however, based on quality data generated in production, standards can and should be derived or altered. Equally as useful, based on statistical feedback, control in processes can be pursued. Statistical analysis of the process can be pursued by gathering information which is documented/stored. If designed properly this can provide a data base which will prove invaluable for comparative and analytical purposes related to the processing functions--again enabling better decisions and ongoing improvement if we use the data properly. The basic relationships inherent in data, SPC, teams, documentation, and ongoing improvement are shown in a nearby graphic. Assuming the data base is built-up over time, much useful productivity information should be afforded. For example, if the rate of the task is speeded up, how many conditional units are produced and/or rejected? 
Thus, has productivity or quality actually been improved? Similarly, for training purposes, by observing results based on statistical feedback, employees can gain insights and understanding about when they have mastered a task, providing a sufficient quality level. But this all takes a disciplined, well trained, and "comfortable with numbers" work force--not quickly or easily achieved. We would be well advised to remember that it is desirable to have statistics which clearly provide documentation of quality levels as well as other general communication assistance. Statistical information can provide an excellent communication and documentation system for internal as well as external purposes. Increasingly, technological organizations are demanding clear records of the statistical process control report for a given lot or shipment. But as well, and an extremely important reason for using statistics, is the need to be able to communicate effectively and quickly with internal and external customers, upstream and downstream. If I do not use statistics properly, I increase the likelihood that customers will go somewhere else with their business.

One of the key reasons for using statistics is to actually "know" what we are talking about. It is one thing to say "oh, about half" versus saying 50%. Or if we say "well, quite a few of the products were defective," rather than saying "20% of the products were defective," it is different. Yet again, if we say "10 of the products had 3 defects each, yet none of them were actually defective," it can make a significant difference. The difference is that we are being more precise when we place a proper numerical indicator into the discussion. This is part of what will increasingly be demanded by our customers--and what we must demand of our suppliers. A broad-based model, forming a key part of the quality system, includes virtually all components being discussed in the toolkit, as shown in the graphic earlier. This is all part of the basis for the more robust systems based on data and documentation for improvement. This all relates to wanting to be better informed as we make improved decisions, rather than "seat of the pants" decision making. Precision of communication, and knowing better what we actually mean when we say something, is at the root of the need to use statistics and data. We must understand and use statistics to facilitate enhanced competitiveness, pivotal to our ability to improve the workplace and quality ongoing. The ability to stay healthy and competitive is directly related to our knowledge of statistics, and how we use the statistical tools to solve technical problems and work with one another.

The actual inspection systems can be analyzed for kaizen, as with any other function. Development of lean systems, where the actual process of inspection is embedded in the other work as a value adding process, is not a trivial matter and definitely can be improved in most cases. Knowing WIP in inspection, takt time, throughput and so on is worthy of pursuit and can likely lead to kaizen. Six sigma ties directly to gauging and inspection since we must have reliable data collected at the point of production in order to develop robust data for the basic improvement and problem solving system.

Metrology And Inspection System Services In Quality

Metrology is the science of measurement based on some known standard. Inspection is the comparison of existing materials or components to known standards or values.
If we are going to build effective quality systems it is important to first determine quality characteristics upon which standards can be based, understand the relationship to customer and supplier needs and issues through vendor certification, know the metrological tools available and how to apply them, and finally build it all around reliability principles known to be effective. The following section provides further information and helps to put it all into a context appropriate for building the necessary quality system--focused on measurement and metrology. Measurement scales are of two types, nominal and interval. Nominal measurement scales have numbers assigned for the sole purpose of differentiating one object from another. This could include identification of lots and locations of product in various stages of production. Interval measurement scales are measurement systems for classification which includes an equality of units. This means that there are equal distances between observation points on the scale. Not only can we specify the direction of the difference but we can indicate the amount of the difference as well. This could refer to temperature scales, and any application with all the characteristics of the interval scale plus absolute zero capabilities enabling statements involving ratios of two observations, such as "twice as long" or "half as fast". The relationship of nominal and interval measures to the attribute and variable discussion presented in the previous tools is important. As related to the operator on the floor and the broader system, the nominal scale is a relative attribute measure such as a go no-go gage used for comparative purposes. The interval scale would be a variable measure which may or may not be a standard scale graduated in known units of measure. For purposes of moving the discussion forward it is suggested that the increments of measure in the interval devices and systems should be standard units, enabling articulation with other devices and systems for suppliers and customers. Several metrological principles and tools are important in the context of quality systems. Important principles include basic measures and standards, precision and accuracy, and calibration. Basic measures are useful in industry to help give meaning, through standard values, for inspection. These include length, time, mass, temperature, electric current, light, and others. The levels of standards and values which are commonly used revolve around working levels, calibration, functional, reference and national and international standards. Precision and accuracy are functional interpretations of the standards in production. Precision is the closeness with which a measurement can be read directly from a measuring instrument. It is also considered as the smallest marked increment on the instrument while accuracy is the measure of how close the reading is to the true size of the part. To facilitate quality systems, several general metrology and inspection considerations must be kept in mind. For example, people must be adequately trained to use measuring equipment. It is simply fruitless to proceed without adequate training. Concerning environmental issues, ambient temperatures in labs and plants must be controlled. Also an environmental issue, lighting in facilities must be adequate and appropriate for inspection. Of course, cleanliness is of significance in general but also dust/humidity through filtering/ventilation must be addressed. 
Effective inspection occurs in part through several systems' components, primarily emphasizing the operator. Establishment of quality standards must occur and be evaluated periodically. A system for inspection planning must be put together. How does this occur? What is the organizational structure that encourages good planning and interaction, both for inspection as well as other elements related to quality? Development and use of inspection instructions must be accounted for, as a part of production, certainly to include training for the same. The system, regardless of who does it, must include internal audits for quality and methods to handle reporting media, with documentation and feedback for improving the system. Once reported, how does analysis of data occur, and once analyzed, how does this information get used? Does the system account for good communication and feedback about what was found in inspection?

The degree to which each element in the system is applied varies with the product, customer requirements, production phases, cost, ability of workers, time available, and perhaps other factors. However, the design and use of the elements and their system is a key factor contributing to top rate inspection functions and quality overall. If the quality system and overall production system, within a cultural environment of change, is dynamic and improving on-going, the quality of work life should be improving in very real ways. The workplace should be getting cleaner, certainly including air quality, climate, lighting, general housekeeping and so on. And along with all else, clear criteria for acceptance or rejection must be identified and communicated, as well as how decisions are made for shutting down production, who files what reports and to whom, and so on, all designed to help improve the system. These all suggest numerous key roles for technologists as fundamental services to be performed within the context of supplier and customer relationships.

Historical Background On Metrology

Metrology, the science of measurement, became necessary around 6000 B.C. when prehistoric people discovered agriculture. To find the fertile soil needed to grow crops, early humans had to move away from caves and make their own shelter, thus beginning architecture. As the need for more sophisticated materials for shelter increased, so did the need for metrology. Nearly three thousand years ago the cubit came into use when hand fitting became necessary to put on roofs and to distinguish property boundaries. The cubit, like many other units of measure, was based on the human body. Although the cubit was not broken down into smaller increments until the Romans, the Egyptians were able to build a pyramid measuring 756 feet on each side and rising to a height of 481 feet. As long as tools for building structures and other items were relatively crude, their exact size was of little consequence. When metal came into use it was discovered that size could be controlled to duplicate any tool that had proved effective. This continued the marriage between tools and metrology during biblical times and well beyond.

Since various people (besides the Egyptians) were discovering the need for metrology, different units were developed. This created conflicts in property boundaries, weights of grains, etc., when nations were taken over by new governments. One such occurrence was after the Romans left England and political authority was divided among the Saxon kings. Although the primary interest of the kings was conquest, the "Law of Edward" evolved.
This law was an accumulation of Anglo-Saxon creeds dating from after the birth of Christ. This was one of the earliest examples of government standardization and consistency. During the Renaissance, the mechanical clock and early, crude machines were built. In addition, the slide rule came into use, the micrometer was invented, and the British yard became the legal British standard, remaining so until the 1800's. In 1718 another improvement in standardization was the so-called "yard stick," established as a permanent standard of length. The reason was that the British yard in metal form was relatively inaccessible to the individual craftsman, for whom standardization is of the utmost importance. At this point, manufacturing was still at a stage in which each shop had its own individual standards.

As improvements in metrology and machining slowly continued, James Watt decided to improve the steam engine. When he did so, Watt ran into the serious limitations of available cutting and machining tools. Watt, and others, up to this period would get around the lack of accuracy in machining through the use of crude hammers and forming tools. Watt could not make the steam engine sufficiently precise until accurate tools evolved. It was not until the late 1700's that boring bars and other tools of sufficient rigidity were available so that a somewhat true cylinder could be bored. The steam engine represented one of the key turning points in metrology since precision and accuracy were combined with metallurgy and design to meet with successful conversion of energy resources into power.

As demand for manufactured products increased consistent with the standard of living, Eli Whitney and others recognized the wastefulness of handicraft methods. When replacement parts were needed they had to be hand fitted. This required a considerable amount of time. If the hand fitting could be eliminated, more time could be spent making products rather than fixing them. Whitney also recognized that if more than one worker were to contribute to a final product, the product of each must be controlled dimensionally. This required the use of tolerances, capable machines, and accurate, precise, repeatable measuring devices. Whitney's ideas were proven in the late 1700's and are now known as the principle of interchangeability of parts, all pivotal for mass production.

During the 1800's devices such as the surface plate, a micrometer that could measure to .001 inch, and gage blocks were developed. In addition, by 1850 most of the basic machine tools had been developed. These included the engine lathe, the index milling machine and the first screw machine made completely of metal. In the early 1900's mass production, as presently known, began and the car industry was started. In order to succeed in the car industry many leaders of industry needed to increase the capabilities of metrology and the machine tool. As the auto industry and WW II drove the need for accurate measuring devices, tools such as the dial indicator, the air gage, and the electric gage were developed. Along with these inventions, developments in automatic tools and new ways of measuring quality were being tested. This included developments such as numerical control (NC) lathes and statistical quality control (SQC). In the 1960's and 1970's final inspection was found to be expensive in terms of one defective part making a whole assembly unacceptable. This began the idea of inspecting parts during and after their previous operation, sampling and process gauging.
In-process gauging now plays an important role in reducing waste and improving quality. With increasing automation in manufacturing, the need for automated measurement is becoming much more apparent. Presently, developments in self-diagnostic machines and systems, machine vision, and lasers are playing an important role in automation. In addition, coordinate measuring machines and microscopes are being used as speed in measuring becomes more critical and electronic components become smaller. In the future, more accurate, reliable and faster instruments will be used more frequently in computer integrated manufacturing systems. These instruments will be developed due to the increasing need for tighter tolerances in industries such as aerospace and automobile manufacturing. This is also driven by the need to reduce costs and enhance delivery schedules, consistent with increasing customer demands for improved quality.

Form, Fit, Finish And Function, Geometric Underpinnings

Quality systems, and in particular metrology and inspection, must assure form, fit, finish and function. These will be further addressed in terms of surface quality considerations, tolerances and allowances, and dimensions and shape. It must be understood that this relates considerably to design and engineering functions, as well as production functions in the technological organization, not to mention function after a product is in service, or reliability.

Surface quality considerations include roughness, and finer irregularities in surface texture, usually resulting from processing but not necessarily limited to processing. Surface quality could also be a function of corrosion or other physical impact beyond processing. Roughness height/width/waviness and other errors of form are all typical concerns when considering surface quality. Measuring surface quality is generally accomplished by moving a fine stylus or probe across the surface of the component being examined. However, a more precise method is interferometry. This is a quality/metrology method involving putting light on the object's surface and measuring the interference in light waves. Surface quality is discussed in detail in a later part of this tool.

When parts are designed to go together, some tolerance is typically allowed. Tolerance says how much deviation from standard can be allowed, above or below, positive or negative. These are generally given as bilateral, unilateral or limiting tolerances. Bilateral is where the part can vary ±.003, in either direction from the specification. Unilateral allows deviation in only one direction, +.003 or -.002. Limiting simply says the range of dimension, as with .003, but is not directional relative to the specification. Tolerances also relate to specification (or "spec") limits in various sections during the discussion of control charting. Allowances indicate contact/space between mating components. Clearance is free space allowed. Interference is when a specific amount of negative clearance is required, as in the case of press fits. Specific fits include the following:

1. Running/sliding--intended to rotate, slide.
2. Locational clearance fits--stationary parts but freely assembled.
3. Transition locational--accuracy of location is apparent, but some clearance or interference can be achieved.
4. Locational interference--light pressure for assembly.
5. Force or shrink--heavy pressing to assemble.
These allowances also relate to specifying quality in the design function, and function in mechanical applications, all discussed in other parts of the toolkit. Base line dimensioning is useful in specification of quality characteristics. It is where all dimensions and measures are given from a common reference point. This eliminates tolerance build up and provides all measures to a common reference point. This is a spin off of computer numerical control, absolute programming, where all dimensions come from the X, Y and Z coordinate intersection in machining and measuring operations. As will be noted later, this also ties into CAD-math data, and new product development. Geometric dimensioning or tolerancing also relates here since all forms/shapes are treated similar to base line and geometric dimensioning. Several forms or types of measurement are of concern for metrological issues. These include perpendicularity, cylindricity, angularity, runout, straightness, flatness, roundness, squareness, parallelism, concentricity, and eccentricity. Each of these will be briefly explored and addressed in a later section.

Foundational Metrological And Measurement Issues

Several basic metrological principles and tools are important in the context of quality systems and data gathering. Important principles include basic measures and standards, precision and accuracy, and calibration. Several metrological tools are necessary to help implement these principles. These include gage blocks, surface plates, and other gages and direct measurement devices. Each of these will be briefly presented and discussed in this section. As well, selected other specific measuring information and devices will be pursued elsewhere in this tool, focused directly on metrology issues and circumstances.

Several basic measures are useful in industry to help give meaning, through standard values, for inspection. These include length, time, mass, temperature, electric current, light, and others. Levels of standards and values which are commonly used revolve around working levels, calibration, functional, reference and national/international standards. These are further identified as follows:

1. Working level--used at the work center and on the shop floor, throughout the plant.
2. Calibration standards--working level gages are calibrated here, often done out in the shop.
3. Functional standards--used in metrology labs to calibrate company standards.
4. Reference standards--certified to the US Bureau of Standards, used in lieu of national standards, done in the metrology lab.
5. National and international standards--final authority to which all standards are traced.

Related to this, and defined earlier, precision and accuracy are functional interpretations of the standards in production. Precision was indicated to be the closeness with which a measurement can be read directly from a measuring instrument, also considered as the smallest marked increment on the instrument. By contrast, accuracy was defined as the measure of how close the reading is to the true size of the part. Calibration is defined as the comparison of a measurement, standard or instrument of known/dependable accuracy with another standard or instrument to detect, correlate, report or eliminate by adjustment, any variation in the accuracy of the device being compared. Calibration systems generally involve procedures to enable a comparison of the instrument being calibrated with a standard having a higher degree of accuracy.
This will be defined further in a later section. Gage blocks are metal blocks, typically accurate to a millionth of an inch. They are useful in laboratories to serve as one of the most accurate references available. Other tools/gages are calibrated from gage blocks. Gage blocks come in various grades and should be selected for the type of work to be accomplished. Gage blocks have two working surfaces which are flat, smooth and parallel, and they are made of special steel which combats changes in dimension and is hardened to avoid wear.

A surface plate is a reference surface which is relatively flat, for basing measures upon. Typically made of cast iron or granite, surface plates must possess (1) sufficient strength to support a test piece, or setup, with dimensional stability; and (2) sufficient accuracy for the measurements required. Cast iron plates have the characteristic of a greater strength to weight ratio, relative to granite. The cast iron plate is less likely to chip/fracture and it can be wrung without damaging the surface, whereas granite may gouge. Also, cast iron plates are magnetically useful in terms of special setups. By contrast, granite plates have the characteristics of being non-corrosive, and they do not burr or get craters. The granite plate has better flatness tolerances and better thermal stability, and granite plates are non-magnetic.

Surface plate/gage block accessories for metrological inspection include a toolmaker's flat, which provides a smaller measure and is similar to the surface plate, but more precise. Angle plates are used in perpendicular set-up and measurements, and also as an elevating device. Parallels are a support/elevating mechanism consisting of matched pairs with parallel working surfaces. Likewise, V-blocks are useful for locating/holding rounds while sine bars and plates are used to measure angular setups.

Several traditional types of gages are useful for inspection work. Among these are general, fixed size, pneumatic and hand held (direct/indirect) gages. General gages are useful for gauging large numbers of similar components in an efficient manner, particularly linear and angular dimensions, taper, roundness, concentricity, eccentricity, parallelism and contours. Fixed size gages are used for accept or reject situations, as in snap gages for outside diameters, and rings for rounds and shapes. Plugs are used to check internals (all are go-no-go types). Pneumatic gages simply measure air leakage surrounding or within the gage as it is in contact or close proximity with the component being inspected. These are easy to read and have no moving parts, yet they are sensitive to the surface of the product being studied.

Hand held measurement devices include rules, calipers, micrometers, dividers, small hole gages, and telescoping gages. Most of these are also direct measurement devices since they are read directly on the instrument. Indirect gages require comparison to a standard to get an actual final measure. Related to this, bench type inspection devices include the vernier height gage and dial indicators. Likewise, the use of light to view a product or component shape relative to the desired shape as noted on a template is what occurs with an optical comparator. This involves staging a shape on the comparator and projecting it for comparison purposes onto the screen, relative to known values. Often, a vellum or clear plastic print will be overlaid on the screen for direct comparison of part to dimensions.
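Returning to the gage blocks described earlier in this section: because working blocks are wrung together into stacks to build up a reference length, block selection is itself a small, teachable procedure--eliminate the finest decimal place first, then work toward the coarser blocks. The Python sketch below is only an illustration of that rule of thumb. It assumes the series found in a typical 81-piece inch set and a target large enough (roughly .200 inch or more) to be built in the normal way; the function name and example value are hypothetical.

def build_stack(target):
    # Greedy gage block selection, finest decimal place first.
    # Assumes a typical 81-piece inch set: .1001-.1009, .101-.149,
    # .050-.950 (in .050 steps), and 1.000-4.000 blocks.
    # Intended for targets of roughly .200 in or more, as in normal practice.
    rem = round(target * 10000)          # work in ten-thousandths of an inch
    stack = []

    if rem % 10:                         # a .1001-.1009 block clears the 4th place
        block = 1000 + rem % 10
        stack.append(block)
        rem -= block
    if rem % 500:                        # a .101-.149 block leaves a multiple of .050
        block = 1000 + rem % 500
        stack.append(block)
        rem -= block
    if rem % 10000:                      # a .050-.950 block clears what is below 1 inch
        block = rem % 10000
        stack.append(block)
        rem -= block
    while rem:                           # whole inches from the 1.000-4.000 blocks
        block = min(40000, rem)
        stack.append(block)
        rem -= block

    return [b / 10000 for b in stack]

print(build_stack(2.5137))   # -> [0.1007, 0.113, 0.3, 2.0], wrung into one stack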
More recently, surface plates and other traditional metrological tools are being used less and less in favor of coordinate measuring machines (CMM). The CMM is a device, based largely on traditional surface plate logic, having built in overhead measurement capability with the traditional values of surface plate technology as the basis. By using overhead measurement systems, tremendous precision, often lost in traditional hand setups, is gained. Through the use of high quality machine tool precision being built in, CMM's also take full advantage of computer numerical control (CNC) logic, further explained in a later tool relating to automation. It is important to recognize that high quality, precision, direct computer controlled measurements for the automated workplace will often be achieved through the use of the CMM. Even where automation is not the question, CMM technology may be the answer. The CMM can be programmed for various parts, and assuming an organization must inspect a part on a repeat basis, the program can be called up and recycled for significant gains in productivity--realizing there was a cost involved in writing the program. The direct computer control (DCC) feature of the CMM also enables reverse engineering, taking dimensions directly from the component being inspected.

Basic Measurable Features In Geometric Dimensioning

Part of what the previous section leads into is the area of geometric dimensioning and tolerancing (GDT). GDT is a system of communicating in design and technical work, related to drafting and prints, now most often done as computer aided drafting (CAD). This also opens the door for CAD-math data as one of the key numerical systems for communicating from design to production and beyond. More important within the current section and tool, GDT is a bridge between the design side and measurement, since by specifying various features and conditions in design, information for quality and measurement is also being defined for important characteristics in production. This section provides specific details, at an introductory level, relating measurement and design through GDT, with a focus on straightness, flatness, roundness, cylindricity, parallelism, perpendicularity, angularity, circular runout, and total runout. In all cases, examples are given similar to the way they will actually be used in design and quality "print takeoff" applications.

Straightness. The simplistic definition of straightness relates to deviation from straightness as specified in a component, typically a flat or a cylinder. The straightness measure is typically referenced to another surface or an axis in the component. Three types of straightness are of concern: surface elements, axis and centerplane. Surface elements relate two surfaces one to the other, axis straightness indicates axial straightness relative to surfaces, and centerplane relates the centerplane and corresponding surfaces as a single unit. Each of the straightness measures is graphically described nearby (the graphic shows a straightness tolerance of .020 applied to a part with size limits of .510/.490).

Straightness Feature. Example (a) is showing a surface straightness specification. (b) is straightness relative to an axis. Both are relative to a cylinder as provided by the diameter symbol and centerline. Example (c) is a centerplane on a flat. Measurement of the straightness characteristic would occur by rotating a cylinder on its axis and checking for variation along the length.
In a flat, the deviation would be checked across the surface, first one side and then turned, and the other checked. In both cases the MMC allowable deviation in any single plane would be as specified in the box.

Flatness. Another key measurable in metrology, either in the end product or in equipment required to do production, is flatness. Flatness is a geometric form which applies to a continuous surface, as related to waviness. It is important for various reasons, such as mating components' integrity, surface seal capacity, and overall fit and function. When it is specified, it is not within the context of a datum, but what is termed an optimum plane. The optimum plane is referenced within two imaginary lines as a tolerance zone, where the specified flatness surface must remain. Flatness cannot be gauged in the common go, no-go sense of the term, but rather it must be measured. Flatness, while critical, is typically separated from accumulated size tolerances or associated measurement issues. The feature or characteristic is shown as a side view of a circular disc in the nearby specification.

Flatness Feature. What is being communicated is that the surface on the top plane must be maintained within a tolerance zone of .001, across the entire face. The parallelogram is the symbol for flatness, shown in the traditional GDT specification box. For measurement purposes, after the optimum plane is established, a full indicator measure must be made of the plane. This would typically be measured on a surface plate or coordinate measurement set up where three independent yet equivalent points are established. Adjustable jackstands would be adjusted, measured independently, and used as a three point support for the object being measured. With the part suspended atop the jackstands, and after the part has been leveled at the three jackstand points relative to the surface plate, the specified tolerance can be checked.

Looking Down On A Disc. Assuming the part under discussion is a disc, looking downward, as in a top view, the three points where the part is suspended are represented by the points a, b and c. After leveling the suspended component relative to the measuring surface, multiple measures can be made across the entire component surface.
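The flatness check just described reduces naturally to a small calculation once the indicator or CMM readings are in hand. The Python sketch below is a minimal illustration, not a prescribed procedure: the probe points are made up, and a least-squares reference plane is used as a common stand-in for the true minimum-zone (optimum) plane. The least-squares evaluation can only give a value at least as large as the minimum-zone result, so a part that passes this check also meets the callout.

import numpy as np

def flatness(points):
    # Flatness as the band of deviations about a least-squares reference plane.
    # points: iterable of (x, y, z) probe readings taken across the face.
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1]])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)   # fit z = a + b*x + c*y
    residuals = pts[:, 2] - A @ coeffs
    return residuals.max() - residuals.min()

# Hypothetical readings (inches) across the disc after leveling on points a, b, c.
probe_hits = [
    (0.0, 0.0, 0.0000), (1.0, 0.0, 0.0004), (2.0, 0.0, 0.0007),
    (0.0, 1.0, 0.0002), (1.0, 1.0, 0.0009), (2.0, 1.0, 0.0005),
    (0.0, 2.0, 0.0001), (1.0, 2.0, 0.0003), (2.0, 2.0, 0.0006),
]
dev = flatness(probe_hits)
print(f"flatness = {dev:.4f} in, against the .001 callout:",
      "within tolerance" if dev <= 0.001 else "out of tolerance")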
Roundness. Roundness measurement is a form tolerance which applies to cylindrical parts. The tolerance zone for roundness is two concentric circles where the stated value provides the actual tolerance. Graphically, the roundness measure is shown nearby.

Roundness Feature. The graphic indicates that the diameter (∅) of the part must conform to a size dimension of maximum .500 and minimum .490. The roundness of the part must be held to a tolerance of .003. Significantly, it should be noted that the roundness is not contingent upon the diameter. While there is no required relationship, it is also true that the indirect relationship will frequently relate the two. The more stringent the diameter measure, the more likely the roundness measure is to be held. The actual measurement of roundness is compounded by the reality that roundness appears to be measurable by standard two point direct measurement equipment such as calipers and micrometers. But two point measures can be misleading because any two points across from each other (180 degrees apart) may or may not be concentric or eccentric. This is also compounded by the eccentric/concentric issue being different at all points along the length of the cylinder. Effectively, the actual functional outside diameter of the component will be the next size up cylinder which could slip over the component. Moreover, the "lobe error," as it is called, will have more than only two lobe points at each "slice" of the diameter being measured. Typically, in non-critical parts or applications, the lobe error issue can be dealt with by taking the average of multiple diameter measures with two point measurement equipment. But for more critical part measures the method will require a surface plate set up, "V" blocks, and a dial indicator. The dial indicator used in the full indicator movement (FIM) mode can give a fairly robust measure of overall lobe behavior as defined in concentricity or eccentricity terms. This is depicted graphically nearby.

Concentricity Or Eccentricity. Simplistically, circles ab and cb, as separate entities, may be round at the two points indicated on each. Whether they are similar at the infinite other two point possibilities around the circumference, or along the length if a shaft, will remain a question. Gathering additional points and averaging will help in gaining a more accurate measure. Placing a shaft on centers and rotating it on its axis, or rotating it in a rotary table, can provide alternative methods for measurement relative to V blocks. Similarly, gathering multiple measures at each slice of the same length/diameter, and averaging, can assist. Use of the CMM for the same functions, as with many applications, can reduce the number of moving parts in the measure, and human interference, potentially reducing the overall error.
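Following the lobe-error discussion above, the Python sketch below shows one common way to turn a set of probed circumference points into an out-of-roundness number: fit a circle to the points (a least-squares, or "Kasa," fit is used here as a convenient approximation of the minimum-zone evaluation) and report the spread of the radial deviations about the fitted center. The probe points are simulated with a three-lobed error of .0008 on a nominal .500 diameter; every name and value is illustrative.

import numpy as np

def out_of_roundness(xy):
    # Fit a least-squares (Kasa) circle: x^2 + y^2 = A*x + B*y + C,
    # so the center is (A/2, B/2); report max minus min radial deviation.
    pts = np.asarray(xy, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    M = np.column_stack([x, y, np.ones_like(x)])
    A, B, _ = np.linalg.lstsq(M, x**2 + y**2, rcond=None)[0]
    radii = np.hypot(x - A / 2.0, y - B / 2.0)
    return radii.max() - radii.min()

# Simulated probe hits (inches) around a nominal .500 diameter part
# carrying a three-lobe form error of .0008 on the radius.
theta = np.linspace(0.0, 2.0 * np.pi, 24, endpoint=False)
radius = 0.250 + 0.0008 * np.cos(3 * theta)
hits = np.column_stack([radius * np.cos(theta), radius * np.sin(theta)])

print(f"out-of-roundness ~ {out_of_roundness(hits):.4f} in (feature tolerance .003)")

It is worth noting that a three-lobed form like the one simulated here reads an essentially constant diameter on a two-point micrometer, which is exactly the trap the paragraph above warns about; the multi-point fit exposes the lobing that the two-point check hides.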
Cylindricity. When the individual features of roundness, straightness and taper are combined into one measure, the more complex issue of cylindricity is addressed. Roundness represents, theoretically, only one slice along the diameter of the cylinder. By contrast, cylindricity seeks to connect all of the slices along the surface of the cylinder, and compare both sides of the cylinder simultaneously. Graphically, the cylindricity measure is shown nearby.

Cylindricity Feature. The feature control box is specifying that the cylinder must have all surface points along the length within a stated tolerance zone of .003. The tolerance zone is specifying that all roundness measures, straightness of surface, and taper from end to end will lie within it. The compounding features of cylindricity are graphically illustrated by the geometric forms presented earlier with reference to roundness, now further expanded.

Taper As Part Of Cylindricity. Not only is taper notably an issue, but multiple tapers and other irregularities must be ascertained. As referenced in the previous graphic showing the tolerance zone of .003, the part showing multiple tapers would be measured to determine compliance within the specified zone. Measurement sophistication must match or exceed the complexity of the feature specification in precision and accuracy.

The key difference between the roundness measures previously discussed and the cylindricity feature is that straightness and taper must now be determined and maintained. As the simple roundness measures are accomplished, now the question of other complexities must be checked by determining total cylinder geometry, sometimes referred to as total runout. In this case, as well as representing and averaging multiple roundness measures on the surface and constructing these into the geometrical form, straightness, taper and other elements must be represented. This could include elements such as concentricity and eccentricity. These could be established by running a dial indicator or CMM probe along the axis of the cylinder in multiple iterations. This is illustrated nearby.

Cylinder Rotated On Centers. As the part is rotated, systematically, the dial indicator or probe is used to establish points on the surface. Points are consistently and incrementally established, recorded and averaged. As the more complex aggregate values are identified and plotted, the geometric form of the cylinder will be established for analysis.

Parallelism. When surfaces or axes must be controlled relative one to the other for parallel conditions it is referred to as parallelism. At least one datum is called out as a plane or axis relative to another plane or surface. This may result in a tolerance zone between two planes with a datum as a plane. The tolerance zone could be the axis of two planes as a tolerance zone when a datum is a separate plane for an axis to surface parallel condition. Or the tolerance zone could be expressed as a cylinder where the parallel condition is the axis of the cylinder relative to another axis as the datum. In all cases the parallel specification must express a tolerance zone within which the parallel condition must fall, shown as two parallel slanted lines in the feature box. This is shown graphically nearby.

Parallel Feature. The graphic indicates that feature A must be parallel to datum A. Thus, all surface points on A must fall within a tolerance zone of .003 relative to datum plane A. This type of feature would generally be measured at a surface plate with a dial indicator or at a CMM. Various locations on the surface could be checked to determine that none fall outside the tolerance zone.

Perpendicularity. Perpendicularity applies to conditions or features where the orientation of parts must be 90 degrees one to the other. The tolerance zone is represented by two parallel planes apart by the specified distance, and 90 degrees to the datum. This is communicated in the nearby Perpendicularity Feature graphic as providing a .003 tolerance zone for feature A relative to datum plane A. Once again, the basic measurement system would probably require a surface plate and dial indicator set up as a minimum. The preferred method would generally be a CMM to characterize the perpendicular feature on the outside dimension being specified. It should be noted that the feature, as identified, is only specifying the outside surface, and that if perpendicularity of the inside surface were desired this would need to be a separate feature.

Angularity. Where angles must be specified in geometric form, the term is generally referred to as angularity. The angularity feature, while seemingly less complex than some other features, specifically cylindricity, does bring forward some interesting complexities in the GDT communication system. Since angularity requires a reference of the angle to some 90 degree datum, this introduces what is termed primary and secondary datums in the GDT system. This is illustrated graphically nearby.

Angularity Feature. The angle of the specified surface is 30 degrees relative to primary datum A. However, the tolerance zone of .003 is aligned by secondary datum plane B, shown as a 90 degree reference. As a measurement task, sine plates or bars would be used in the traditional setup at a surface plate to achieve the specified angle. The sine plate is an adjustable angle device, providing the desired end result, while the sine bars would be set with gage blocks used to geometrically configure the appropriate angle in set up. The preferred method would be to use a CMM and characterize the angle with direct hard probe hits which are converted into an angularity measure with the algorithms of the software of the machine. Use of the CMM would permit placing the part to be measured directly into the work envelope of the machine, and taking the hits for rather immediate conversion of data into an angle measure. By contrast, the traditional set up would require configuring the 30 degree angle of the part to become parallel to the surface plate through the use of the sine plate or bar technology. Use of the CMM would not only be quicker, but the CMM would likely enhance the overall precision of the measurement exercise, reducing measurement error.

Circular runout. Runout was introduced earlier within the context of cylindricity. As discussed earlier, as the simple roundness measures are accomplished, the question of other complexities must be checked by determining total cylinder geometry, sometimes referred to as total runout. The key difference in cylindricity and roundness, relative to runout, is that the cylindrical features are specified to a datum such as an axis or diameter. This is presented as circular runout and total runout. The less complex of the two is circular runout, and this will be presented first as part of the basis for total runout. In the case of circular runout, multiple roundness measures of the surface must be represented. These were discussed in measurement terms as running a dial indicator probe along the axis of the cylinder in multiple iterations, taking individual "slice" measures at various locations. This is illustrated nearby.

Cylinder Rotated In V Blocks. As the part is rotated in v blocks, systematically, the indicator or probe is used to establish points on the surface. Circular measure points are consistently and incrementally established as high elements or slices of the cylinder. As the more complex aggregate values are identified and plotted, the geometric form of the cylinder will be established for analysis. Graphically, the circular runout measure is shown nearby.

Circular Runout. The feature control box is specifying that the cylinder must have all circular measured points along the surface within the stated tolerance zone of .003. The tolerance zone is specifying that all circular measures on the shoulder feature must fall within .003 relative to the datum surface A. After taking a minimum of two separate slice measures, preferably three or more, the tolerance compliance can be determined as runout from the datum. Significantly, the specified datum A requires use of v blocks to reference the actual total diametral characteristic, relative to use of the axis on centers. By using the v block to rotate the component within, the actual circular runout will be read at the specified point on the shoulder. However, it must be remembered that the v block is measuring from only two surface points, those being the highest points on the surface from which they are referenced. Concern over use of v blocks applies to both circular as well as total runout, and the concern also helps explain why more sophisticated technologies such as CMM have been pursued.

Total runout. Total runout relates to circular runout in much the same way as roundness relates to cylindricity. While roundness is one measured "slice," similar to circular runout, the more complex total runout relates all geometric considerations relative to the cylinder within the context of runout. Thus, the total runout feature is a composite control for rotating parts taking into consideration roundness, concentricity, straightness, taper, and part surface profile. As related to end surfaces in rotating parts, total runout would be concerned with wobble, perpendicularity and flatness of surface. As a composite, total runout is a critical feature for control of rotating components in reference to balance, vibration and the overall dynamic in operation. Similar to cylindricity, the total runout feature requires a FIM for depicting part control, giving the composite surface geometry and dimension.
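The circular and total runout checks just described come down to very simple arithmetic on the full indicator movement (FIM) readings once the part is rotating on its datum: circular runout is the max-minus-min indicator swing at each individual slice, while total runout applies one band to every reading taken along the length in a single setup. The readings in the short Python sketch below are hypothetical.

# Hypothetical FIM readings, inches: one row per axial slice, one column per
# angular position as the part is rotated about its datum axis.
fim_readings = [
    [0.0004, 0.0009, 0.0011, 0.0006, 0.0002, 0.0007],   # slice at the shoulder
    [0.0003, 0.0008, 0.0012, 0.0007, 0.0001, 0.0006],
    [0.0005, 0.0010, 0.0013, 0.0008, 0.0003, 0.0009],
]

# Circular runout: each slice is evaluated on its own.
circular = [max(s) - min(s) for s in fim_readings]

# Total runout: a single band over every reading along the surface.
all_readings = [r for s in fim_readings for r in s]
total = max(all_readings) - min(all_readings)

print("circular runout by slice:", [round(c, 4) for c in circular])
print("total runout:", round(total, 4), "(compare with the .003 callout)")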
Basic Principles And Devices For Measurement And Data Collection

Most of the data which will be increasingly important in the quality systems of the future will be determined, measured and recorded in thousandths as a decimal value. Technically this is written as .001 for one thousandth, and if in inches, we would say .001 inch. If this were in hundredths it would be written as .01 inch, and in tenths as .1. To convert from a fraction into a decimal we divide the lower half of the fraction into the upper half. Thus, if the fraction 1/2 inch were provided, 2 divided into 1 equals .5, and if this is done at the thousandths level of accuracy we would say .500 inch. If the fraction 1/4 inch were provided, 4 divided into 1 equals .25, and again if done at the thousandths level of accuracy we would say .250 inch. Converting the following fractions into thousandths-of-an-inch decimal readings nets:

1/3 = .333 inch
2/3 = .666 inch
3/8 = .375 inch
1/8 = .125 inch
1/16 = .062 inch

Foundational instruments and devices used in basic measurement to derive the measurements shown above are defined as rulers, vernier calipers, and digital gages. This is not exhaustive but should suffice as an introduction for most persons getting started with measurement instruments and devices. The ruler is a foundational device since the vernier scale, which is essentially the ruler, is used as the basis for many other important measuring devices. This includes the vernier caliper and the micrometer, although these devices are changing rapidly to become digital or, in the case of calipers, often dial instruments. A vernier scale is attached to an instrument so that it may slide in a path parallel to the line of measurement. The main scale is also parallel to the line of measurement. Both scales are mounted so that readings can be made with minimum parallax error, meaning the incorrect readings that result from misalignment in graduations on scales.

Advantages of vernier scales include:

1. Amplification is achieved by design and is not dependent on moving parts.
2. No interpolation is required.
3. There is no theoretical limit to the scale range.

Disadvantages of vernier scales include:

1. The vernier scale is located on the instrument used and is dependent on the instrument for accuracy.
2. The reliability of readings depends significantly upon the observer, relative to most instruments.
3. The discrimination is limited, to a great extent, by the person using the gage.

The advantages and disadvantages of vernier scales are shown graphically nearby. Related to the caliper, digital gages are the fastest developing type of high precision measurement and gauging.
Their ability to provide two or more scales plus their relatively low cost explains their popularity. The concept to keep in mind, however, is that the digital instruments are only as accurate as their mechanical parts, regardless of how far the zeros in the readout extend. Most digital instruments are easy to use but some have features that may assist in different situations. The best way to learn about the general and added features of a digital instrument is to read the operator's manual. Some general advantages and disadvantages follow.

Advantages of digital instruments are identified as:

1. They can be zeroed out at any point within their range.
2. They reduce the visual error.
3. Instruments can be tied into real time SPC.
4. Remote operation is possible.

Disadvantages of digital instruments are identified as:

1. The digital readout can be misleading due to the digital accuracy not corresponding to the mechanical accuracy of the device.
2. They are less rugged relative to standard vernier instruments.

The advantages and disadvantages of digital instruments are shown in summary form in a nearby graphic. It is important to note, relative to instruments and devices for measuring using vernier and digital systems, that the applications are virtually unlimited. This can include rulers, dial indicators, calipers, micrometers, and others in terms of hand held instruments. But it can also include placement on machines and equipment, such as hard gages, or dedicated systems for extracting variable measures for data gathering. This is important in the context of the previous tools on attribute data transitioning to variable data, for enhanced ability to know whether components and parts are in compliance with specifications for given characteristics.

A distinction should also be made between destructive and nondestructive testing in inspection. Nondestructive testing (NDT) implies that some inspection of the item will be accomplished with inspection techniques that can find surface or internal defects without destroying the item. By contrast, the item will be rendered non-useful through destructive testing. For example, arc welds in a nuclear reactor receive 100% NDT inspection for defects because the cost of a failure here can be great. The primarily used NDT techniques include the following: liquid or dye penetrant testing, magnetic particle testing, radiographic inspection, neutron radiography, ultrasonic testing, Eddy current testing, acoustic emission, thermal inspection, and optical holography, among others.

Shifting Toward The Metric System

One of the key areas of development in basic measurement is the shift which is occurring toward metric as the standard. This has been shifting gradually worldwide, to the extent that most of the rest of the world is metric while the US remains on the English system. Over the next several years there will be a continued effort to move the US into the metric measurement system, led primarily by organizations doing global business. The bottom line is, we need to have everyone on the same sheet of music, certainly including the units of measure, as well as instruments, data collection and documentation. Advantages of the system. By way of introduction, it should be pointed out that there are several advantages in the metric system. This includes the fundamental reality that there is one basic unit for each quantity which directly and logically relates to all other units. Decimals are used exclusively, with no fractions, providing a more precise and accurate basis for measurement in all applications.
The inherent gains in use of the metric system apply directly to our need to be able to communicate, with the least ambiguity and confusion and in logical mathematical ways, how production is occurring. This is aided by the elimination of long rows of zeros, simple and absolute symbols, and the system being comprehensive in the sense that all measures are covered under one umbrella. Other advantages relate to the need for shifts in ISO and QS standards as all parties move toward solid global supplier and customer systems for producing product.

Getting a handle on the meter. By way of relative comparison, one millimeter (mm) is about the thickness of a dime. Ten mm are roughly equal to the width of the small fingernail, or one centimeter (cm). One hundred cm are equal to one meter, a meter being slightly longer than one yard. A 100 mm cigarette is just under four inches, 35 mm film is about an inch and a half, and a standard door is roughly two meters in height. One thousand meters is called a kilometer (km). The basis for all measures in the system is the meter, with powers of 10 being the fundamental building blocks of the system. This provides a nomenclature system as follows:

meter = m, millimeter = mm, centimeter = cm, decimeter = dm, kilometer = km
10 mm = 1 cm
10 dm = 1 m
100 cm = 1 m
1000 mm = 1 m
1000 m = 1 km

If we were to translate this into decimals, as is done for precision purposes when converting from typical fractions in the English system, the two systems might appear similar at a quick glance, all based on the millimeter and the inch. The example in this case provides increments of 1/8 inch, or .1250 decimally, converted to millimeters:

Fraction   Decimal (inch)   Millimeters
1/8        .1250             3.175
1/4        .2500             6.350
3/8        .3750             9.525
1/2        .5000            12.700
5/8        .6250            15.875
3/4        .7500            19.050
7/8        .8750            22.225
1          1.0000           25.400

While there are similarities, and the English system appears quite satisfactory, problems (or opportunities) can be noted at longer distances, where we must resort to feet and inch conversions in illogical increments, with yards and miles even less logical, and so on. The point is that the metric system, with its base-10 logic, is much more logically designed, and is particularly advantageous for precision and accuracy in measurement.
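To illustrate how directly the two systems relate, the short sketch below (again Python, for illustration only) regenerates the eighth-inch table above using the exact definition of 25.4 mm per inch.

# Regenerate the eighth-inch conversion table: fraction -> decimal inch -> millimeters.
# The factor of 25.4 mm per inch is exact by definition.
MM_PER_INCH = 25.4

for eighths in range(1, 9):          # 1/8 inch through 8/8 (one full inch)
    inches = eighths / 8
    millimeters = inches * MM_PER_INCH
    print(f"{eighths}/8 inch = {inches:.4f} inch = {millimeters:.3f} mm")

The printed values match the table above, underscoring that the metric figures are simply the same dimensions expressed in a base-10 unit.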
Surface Quality: Focused Foundational Metrological Issue

In 1962, representatives of the United States of America Standards Institute, the Canadian Standards Association, and the British Standards Institution signed an American, British, and Canadian declaration of accord on standards for surface texture. The standards set up criteria for the standardization of tracer-point measuring instruments and roughness samples. The term "Machine Finish" refers to the geometric irregularities produced on the surface of a solid material by the cutting action of a tool, by abrasives, or by other finishing devices. In fact, designers put explanatory notes such as "Rough Grind," "Lap," "Smooth Grind," etc., on their drawings in order to secure the finish they want. Actually, they are specifying a method of manufacture rather than a surface finish. Each type of tool or machining operation leaves its own individual markings. For example, a shaper with a round-nose cutting tool produces long, regularly spaced U-shaped furrows having sharp ridges; an end mill makes a curved pattern with half the curves extending in one direction and half in the other. Transverse markings (those across the direction of tool motion) are the result of the profile shape of the tool and the rate of feed. Longitudinal markings (those in the direction of tool motion) result from the irregular cutting action of the tool during the process of chip removal, changes in speed or in the condition of the tool, or small variations in the uniformity of the material being worked.

Fundamental theory of measurement. A variety of instruments are available for measuring surface roughness and surface profiles. The majority of these devices employ a diamond stylus which is moved at a constant rate across the surface, perpendicular to the lay pattern. The rise and fall of the stylus is detected electronically (often using an LVDT), amplified and recorded on a strip chart, or processed electronically to produce AA (arithmetic average) or rms readings. The unit containing the stylus and driving motor may be hand held or supported on the work piece or another supporting surface. The resolution of these devices is determined by the radius, or diameter, of the tip of the stylus. When the magnitude of the geometric features begins to approach the magnitude of the tip of the stylus, great caution should be used in interpreting the output from these devices.

Profilometer. The Profilometer is recognized as one of the best, if not the first, practical tracer-type electronic instruments for measuring surface roughness. It is a mechanical-electronic shop instrument for measuring directly the average height of surface roughness, in millionths of an inch, on all types of surfaces produced by machining, grinding and finishing operations. It is composed of three principal parts: the tracer, the amplimeter, and suitable piloting means. The tracer consists of an electromagnetic reproducer made up of a moving coil, to which is attached a diamond tracing point, and a stationary permanent magnet. The diamond point has a 90 degree included angle and a .0005” radius hemispherical tip. It moves up and down as it is guided over the roughness irregularities of the surface being measured. The motion of the coil in the magnetic field generates a voltage that is proportional to the roughness irregularities, since the tracer is constrained to move in a direction perpendicular, or vertical, to the nominal surface being measured. The amplimeter, an electronic amplifier with micro inch instrumentation and controls, amplifies the voltage produced by the tracer and activates the meter. Since the meter may be of the averaging type (approximately a two-second period), the height of single scratches is not obtained, but rather a reading indicating the average height of the irregularities of the surface being traced. Suitable piloting must be provided to move the tracer over the work surfaces. This is often done manually on both internal and external surfaces, using tracers with suitable skids. However, it is often automated or mechanized, as is the case with the Surfcom product by Brown and Sharpe.

Limitations of tracer point measurement. It is not practical to form the point so that it would be perfectly sharp and slender enough to reach into every scratch, no matter how small. Such an extremely fine point would tend to catch on the irregularities of the surface and either deform them or break off. Obviously, it is also possible that the actual data being generated will be affected by the size and shape of the stylus.
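To make the AA and rms readings described above concrete, the following sketch computes the arithmetic average roughness (Ra, the AA reading) and the rms roughness from a set of sampled stylus deviations. The height values are hypothetical, and the sketch ignores the cutoff filtering and calibration that a real instrument applies, so it illustrates only the underlying arithmetic.

# Simplified roughness arithmetic from sampled stylus heights (micro inches).
# Ra is the arithmetic average (AA) of the absolute deviations from the mean line;
# rms is the root-mean-square of the same deviations. Cutoff filtering, which a
# real instrument applies, is omitted here.
import math

heights = [4.0, -2.5, 3.0, -3.5, 2.0, -1.5, 4.5, -3.0]   # hypothetical trace samples

mean_line = sum(heights) / len(heights)                   # nominal (mean) surface
deviations = [h - mean_line for h in heights]

ra = sum(abs(d) for d in deviations) / len(deviations)
rms = math.sqrt(sum(d * d for d in deviations) / len(deviations))

print(f"Ra (AA) = {ra:.2f} micro inch   rms = {rms:.2f} micro inch")

As the amplimeter discussion above notes, a production instrument reports this kind of average over the traverse rather than the height of individual scratches.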
In addition to the stylus itself, another factor which limits the accuracy of tracer point analyzers is the skid, which supports the tracer diamond. Both the size and the position of the skid directly affect the reading attained, because it is the motion of the diamond in relation to the skid, rather than the actual movement of the tracer point with respect to the surface, that is measured.

Surface quality terminology. As the surfaces of all materials finished by milling, turning, grinding, and other operations are composed of myriad tiny irregularities, certain terms descriptive of those irregularities should be thoroughly understood. These include roughness, flaws, roughness width cutoff, waviness and lay. Each is further defined below:

1. Roughness, the factor controlled by the use of the micro inch surface finish designation, is a recurrent irregularity typical of the surface. It refers to all those deviations from a nominal surface which are characteristic of the surface, regardless of the crest-to-crest distance of the irregularities.
2. Flaws, such as scratches, dents, and other local damage, are irregularities or imperfections which occur only at a few places in a piece of material; they are not typical of the surface, as they are usually caused by conditions other than the normal machining process.
3. Roughness width cutoff refers to the ability of an instrument to differentiate between fine pitch and coarse pitch roughness irregularities, and hence permits a better description of a surface to be made when necessary. It is a matter of adjusting the sensitivity of the instrument in gauging the irregularities of the surface rather than a characteristic of the surface being assessed.
4. Waviness is surface irregularity of greater spacing than occurs in roughness. It may be the result of warping, vibration, or the work deflecting during machining.
5. Lay is the term used to refer to the direction of the predominant tool marks, grain, or pattern of the surface roughness.

Related to several of these terms, it should be noted that all surface roughness measurements or comparisons are normally taken across the lay. It is this direction which gives the best comparative value, and the highest reading, on tracer point analyzers.

An example setup for surface quality. This provides one approach to deriving surface quality data in a typical manufacturing environment. The principles and information can be applied to many applications, and the example is intended to be start-up in nature. The basis of this setup is the SURFCOM profilometer by Brown and Sharpe, although other equipment could be used. The SURFCOM setup includes the following:

1. Tracing driver and cord, hookups.
2. Measuring stands and pickup.
3. Amplifier/recorder and power supply setup.
4. Appropriate software for calculating descriptive statistics, printer and other support setup.
5. User's guide to the SURFCOM.

Specimens might be 3µ, 6µ, and 1µ. Typical procedures would include an SOP, such as is shown here. It should be recognized that this is summarized, and should be studied further in the actual user manuals for the specific equipment. The procedures should be considered adequate to get started. The first set of steps (1-14) is oriented primarily toward calibration, while the remainder of the steps (15-21) deal with actual data acquisition.

1. Read the appropriate chapters of the user's guide to become further acquainted with surface texture terminology and the SURFCOM equipment being used.
2. Calibrate the SURFCOM machine following proper instructions. Also refer to the appropriate sections of the user's guide.
3. Push the power switch located on the far left hand side of the amplifier/recorder vertically upwards to turn the equipment "on". Wait at least 15 minutes for equipment warm-up before proceeding.
4. Set the vertical magnification to 2K.
5. Press the buttons marked (<) and (>) located in the upper right hand corner of the amplifier/recorder.
6. Press the yellow preset button so that the yellow LED (light) for the row containing the label "Length" lights up. Also, the green LED labeled "in" on the right side of the display should light up. The yellow preset or data button should display a value of 0.16. This is the traversing length of the pickup stylus on the specimen.
7. Set the "sens" switch to 1.
8. Press the yellow button labeled "Average" so that its LED light is off.
9. Press the green output button so that the green LED in the top row of the outputs is on.
10. Press the parameter select button below the Ra label. The green LED on this button should be on. The display is now ready to output Ra values.
11. Place the calibration specimen under the pickup on the worktable. Ensure the pickup axis is parallel to the specimen surface. If it is not parallel, reset it by re-tightening the skid-skidless knob. Ensure that the stylus tip is vertical to the work piece surface. Caution: care should be taken when lowering the pickup. If it is lowered in excess of the CAL adjuster, the red light of the CAL adjuster will glow, and damage may be caused to the pickup and stylus tip.
12. Lower the pickup carefully using the tracing driver's elevating block until the yellow light on the zero pointer is set to the middlemost position. Note: the CAL adjuster should not be changed while measuring is in progress.
13. Press the "meas" button on the amplifier/recorder to start the measuring process. This should be done once the pickup has made contact with the specimen and the zero pointer is set at the middlemost position. The display should show an Ra (average surface roughness) value of 118 µin (micro inches). While the measuring is taking place, check to see that the zero indicator does not exceed the calculation range for the entire traversal length (neither of the two red lights at the extreme ends should glow).
14. Remove the calibration specimen from under the pickup.

Actual measurement would occur as a part of the next several steps, all based on proper calibration. The example being discussed is designed around X-bar and R charting and gage R and R, as well as capability analysis.

15. Measure a face of a specimen:
(a) Place a face of the specimen under the pickup.
(b) Adjust the pickup height and retighten the pickup clamp knob. Ensure that the stylus tip is vertical to the work piece surface and properly positioned in general.
(c) Ensure that the pickup axis is parallel to the work piece surface. If it is not parallel, set it properly by re-tightening the skid-skidless selection knob and making other adjustments.
(d) Carefully lower the pickup using the tracing driver's elevating block until the yellow light in the zero indicator labeled CAL (also called the CAL adjust) is illuminated; care should be taken while lowering, since if the pickup is lowered too far the red light at the extreme end of the CAL adjuster glows. Then gently turn the CAL adjuster (a rotatable knob) to illuminate the middlemost yellow light of the zero indicator.
(e) Check the status of the "auto return" button. If it is required that the stylus automatically return to its original position, press the button so that the display reads "on".
(f) Press the "meas" button on the tracing driver, or the measure button on the front panel of the amplifier/recorder, to start the measuring process. At the end of the process the display will show the Ra (average surface roughness) value over the traversing length.
(g) While measuring is in progress, check to see that the zero indicator does not exceed the calculation range for the entire traversal length (neither of the two red lights at the extreme ends should glow).
(h) Do not change the CAL adjuster while measuring is in progress.
(i) Record the Ra value on the printed forms. These forms contain the results.

16. Repeat the above procedure at four more different points on the same side of the specimen.
17. Using another specimen, repeat the above.
18. Create a printout of all data collected.
19. Perform and print all basic SPC calculations, including X-bar and R charts, Cpk, and gage R and R, as well as histograms and charts, by following the procedures listed in the software package. To determine R and R, follow the instructions in the toolkit. Provide an analysis and interpretation of the information identified in the previous items. (A brief illustrative sketch of these calculations appears at the end of this section.)
20. Depending on the level of analysis desired, perform a comparison between the two (or more) sets of data using ANOVA, looking for significant differences in variation. This should be done using appropriate software.
21. Similarly, the data could also be used to perform Design of Experiments (DOE), using appropriate software to further analyze the data.

This example is further explored in several subsequent tools within the toolkit.

Questions, analysis, other issues. Several questions should be raised to help the reader focus further on the issues related to surface quality, data collection, inspection and analysis in general. These relate to most of the topics previously presented and discussed, but in most cases they are only the beginning. While the profilometer is the basis for traditional surface quality measures, lasers and other light interference systems will be the more likely approaches for the future. This is because the profilometer, while reasonably accurate, is slow and time consuming. It is also vulnerable to deviations and misinterpretations due to operator error, types of surface generation equipment, and other factors and elements. Yet the above presentation should help the reader begin to understand elements of the relationships inherent in surface quality, reliability and functioning of product, and other general relationships. It should also serve to assist the reader in beginning to appreciate the necessity of properly calibrated equipment, as well as care and knowledge in the collection and acquisition of the actual data so vital to the improvement process.
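As a follow-on to steps 19 and 20 above, the sketch below illustrates, with hypothetical Ra readings, the basic X-bar and R control limit and Cpk arithmetic that the SPC software referenced in the procedure would normally perform. The data, subgrouping and specification limits are invented for the example (five subgroups of five readings each, loosely mirroring the five readings taken per specimen face), and the constants A2, D3, D4 and d2 are the standard control-chart factors for a subgroup size of five.

# Illustrative SPC calculations for subgrouped Ra readings (micro inches).
# All data and specification limits below are hypothetical.
import statistics

subgroups = [
    [62, 64, 61, 63, 65],
    [60, 63, 62, 64, 61],
    [65, 62, 63, 61, 64],
    [63, 61, 62, 65, 60],
    [64, 62, 61, 63, 62],
]

A2, D3, D4 = 0.577, 0.0, 2.114        # control-chart factors for n = 5
USL, LSL = 70.0, 50.0                  # hypothetical Ra specification limits

xbars = [statistics.mean(s) for s in subgroups]
ranges = [max(s) - min(s) for s in subgroups]
xbarbar = statistics.mean(xbars)
rbar = statistics.mean(ranges)

# X-bar and R chart control limits
ucl_x, lcl_x = xbarbar + A2 * rbar, xbarbar - A2 * rbar
ucl_r, lcl_r = D4 * rbar, D3 * rbar

# Cpk from the within-subgroup estimate of sigma (R-bar / d2, with d2 = 2.326 for n = 5)
sigma = rbar / 2.326
cpk = min(USL - xbarbar, xbarbar - LSL) / (3 * sigma)

print(f"X-bar chart: UCL = {ucl_x:.2f}, CL = {xbarbar:.2f}, LCL = {lcl_x:.2f}")
print(f"R chart:     UCL = {ucl_r:.2f}, CL = {rbar:.2f}, LCL = {lcl_r:.2f}")
print(f"Estimated Cpk = {cpk:.2f}")

The ANOVA comparison in step 20 could be carried out, for example, with scipy.stats.f_oneway applied to the readings from the two specimens, and the gage R and R values would be determined by following the toolkit procedures referenced earlier.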