How precise are measuring instruments?

Topics include: Machine Tools & Tooling, Precision Measuring, Materials and their Properties, electrical discussions related to machine tools, setups, fixtures and jigs, and other general discussion related to amateur machining.

Moderators: GlennW, Harold_V

User avatar
GlennW
Posts: 7284
Joined: Sun Apr 15, 2007 9:23 am
Location: Florida

Re: How precise are measuring instruments?

Post by GlennW »

pete wrote: Sun Dec 26, 2021 4:08 am There's another way of checking the accuracy of a non-adjustable solid machinist square that's fast and dirt simple if you have a mill, a good trustworthy milling vise with a straight, properly ground fixed jaw, and a 10ths-capable dti. Zero the mill vise to the table's X axis as normal; zero/zero to 10ths on the fixed jaw if you can manage it. Run the Y axis all the way towards you, and set the square's short fat end against the mill vise's fixed jaw with its thinner long blade pointing towards you. GENTLY close the vise just enough to hold that short leg of the square against the fixed jaw. Fasten a magnetic base somewhere on the mill that's not on the X or Y axis. Zero the indicator tip against the edge of the square leg pointing towards you at the end closest to the column, lock the table's X axis, then run the Y axis in so the indicator tip runs along the square's blade. Write down any reading above or below zero. Loosen the vise and flip the square 180 degrees to the OTHER side of the vise and repeat the exact same test. Any difference between the two numbers is the inaccuracy present on that blade edge of the square. With that you can then use a mike to measure how parallel the blade is to the edge you just checked. If the square is in fact square and the measurements show the blade edges are parallel, then the inside is true to the outside you've checked. You could also check the inside edge against the outside with that mill vise and a dti if you set the square in the vise with enough room to get the ball tip of the dti on both edges, and run the same test for each edge.
That works as long as the mill is in very good condition and the gibs are unworn and properly adjusted, or you may end up checking the inaccuracy of the mill's Y axis...
Glenn

Operating machines is perfectly safe......until you forget how dangerous it really is!
John Hasler
Posts: 1852
Joined: Tue Dec 06, 2016 4:05 pm
Location: Elmwood, Wisconsin

Re: How precise are measuring instruments?

Post by John Hasler »

Mr Ron wrote: Sun Dec 26, 2021 11:40 pm Thank you, Pete, for your explanation. Are you saying that if I get a reading of .355", the piece could actually be .354 or .356?
With a recently calibrated, high-quality instrument (I wish I had such a thing) it means that the piece is between .354 and .356.

The rule of thumb I learned (I don't recall where) is that to reliably produce end results accurate to +/-.001 you must work to +/-.0001. Of course you can analyze the process and calculate the accuracy required for each operation or component. I've done this for electronics but never for machining.
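That rule of thumb reduces to one line of arithmetic; here's a throwaway sketch (the function name and numbers are mine, purely illustrative):

```python
def required_resolution(tolerance, ratio=10):
    """Rule of thumb: measure with at least `ratio` times finer
    resolution/accuracy than the tolerance you're trying to hold."""
    return tolerance / ratio

# Holding +/- .001" calls for measuring to about +/- .0001"
res = required_resolution(0.001)
```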
SteveM
Posts: 7763
Joined: Mon Jun 27, 2005 6:18 pm
Location: Wisconsin

Re: How precise are measuring instruments?

Post by SteveM »

mcostello wrote: Sun Dec 26, 2021 7:43 pm Mark it with chalk,
Cut it with a torch,
Grind to fit,
Paint to hide.
My brother is a carpenter.

He says "caulk hides a multitude of sins".

Steve
pete
Posts: 2518
Joined: Tue Feb 10, 2009 6:04 am

Re: How precise are measuring instruments?

Post by pete »

Actually Glenn, pretty much none of that matters. You're using a single slide to repeat the exact same travel path twice. Any deviation in the numbers between check one and check two is the deviation in your square. If your mill has a seriously worn Y axis, then yes, I'd snug up the gib a little tighter than normal to take out any slack that might affect the numbers. Checking without flipping the square 180 degrees would just be testing the Y axis against whatever your as-yet-unverified square says square is; flipping it over and comparing the two sets of numbers between the tests gives the deviation the square might have.

Compared to something like Moore, SIP, B&S etc. jig borers, I'd be pretty sure most of us with a BP-type or any mill will see some measurable error in the exact squareness between the X and Y axes, but even that doesn't matter. You're comparing one test against the second, and against a known reference surface (that fixed jaw). The Y axis could literally point 20 degrees away from being square to the X axis and that angle would still be the same for each test. What is important is that there's actually a flat, smooth and parallel surface on that vise jaw, or it will make any checks like this useless. The dti numbers can go up or down along the blade of the square if the Y axis is out, but with a square that is 100% true those deflection numbers will be the same on each side of the vise. When they aren't, the difference between the two checks is double the amount the square is actually out, so it's a pretty sensitive test.
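The flip test boils down to one line of arithmetic; here's a sketch (the names are mine, not pete's):

```python
def square_error(reading_side_a, reading_side_b):
    """Blade out-of-square from a flip test.

    Each argument is the DTI deviation seen over the blade's length:
    once with the square on one side of the vise, once flipped 180
    degrees to the other side.  Error in the mill's Y travel shows up
    identically in both readings and cancels; the square's own error
    reverses sign on the flip, so the difference between the two
    readings is twice the true error.
    """
    return (reading_side_a - reading_side_b) / 2

# +0.0006" on one side and -0.0002" after the flip means the blade
# is 0.0004" out over the measured length.
err = square_error(0.0006, -0.0002)
```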

The ALLOWABLE deviation in metrology equipment is ± one count, Ron. I take that to mean a mike reading in .001" increments can measure .001" above or below the exact size and still pass QC. I think the better manufacturers shoot for the middle of the allowable range, and that seems confirmed by just about everything I own that I've checked. On my equipment, any deviation is consistently on either the plus or the minus side, not a combination of both. And John's 100% right. As he said, it's an accepted rule of thumb in industry today to use equipment with a resolution and accuracy at least 10X the level you're trying to measure to. But industry is a whole lot different from what most of us are doing in our own shops. We're not trying to have parts pass inspection with a secondary measurement performed somewhere else. What most of us need is for all our instruments to measure the same to a close tolerance level. Where it might be important for us is when fitting an off-the-shelf part, such as a bearing race, to a bore we're machining. Even then you'd measure the bearing O.D. with your mike and use that same tool to measure something like your telescoping gauges while boring the bearing recess. So what we're really doing is comparative measurement, not exact measurement as with certified metrology equipment. I can guarantee I couldn't do it, but I've read more than once that good, experienced machinists could fit even press-fit bearing races with nothing more than a set of old-school inside and outside calipers that can't really measure anything on their own. Their experience and sense of touch filled in for what we now expect much higher-cost instruments to do.

I have a book first written in 1907 titled Tool & Gage Work. It shows master gauges being made on a sleeve-bearing lathe's faceplate using a good micrometer, a set of shop-made hardened and ground toolmaker's buttons, and a shop-made dti capable of showing under .00005" deviation, then hand-tapping the parts into location for drilling, boring, and grinding to less than that 50 millionths. This was before jig borers and jig grinders were even invented, so they did what they could with what was available at the time. I've got equipment that could maybe measure work like that, but there's no hope either I or my current lathe could match what they considered possible.
User avatar
GlennW
Posts: 7284
Joined: Sun Apr 15, 2007 9:23 am
Location: Florida

Re: How precise are measuring instruments?

Post by GlennW »

I'm referring to the slop in the ways/gibs, not the perpendicularity of the axis movement.

Mill ways wear in the center of travel. Tight at the travel limits and loose in the center.

I'm looking at checking a machinist's square for error in tenths, and your method would be good for looking for something greater than that. Asking the average mill table to repeat to a few tenths in 12" or more of travel isn't something I would depend on.

The square in my link is also about 10" long, so you would need better than 20" of table travel to check it on a mill. I wouldn't count on my Bridgeport to repeat to tenths in that distance.
Glenn

Operating machines is perfectly safe......until you forget how dangerous it really is!
pete
Posts: 2518
Joined: Tue Feb 10, 2009 6:04 am

Re: How precise are measuring instruments?

Post by pete »

Yep, I'd certainly agree that anything more than maybe a 6" square would be pushing it for sure, Glenn. My mill has 22" of X axis travel, so a 20" long square could be checked by bolting the vise down with its fixed jaw parallel to the Y axis and flipping the square the same way. But that's almost pointless, since any mill in even brand-new condition isn't nearly accurate enough over those lengths. Then it becomes really tough to figure out how it could be done accurately enough. Any method I can think of right now starts looking pretty complex and time-consuming.
atunguyd
Posts: 199
Joined: Tue Oct 20, 2009 9:39 pm
Location: Durban South Africa

Re: How precise are measuring instruments?

Post by atunguyd »

rmac wrote:
atunguyd wrote: Sat Dec 25, 2021 11:29 pm But if my ruler says that the 10cm block is 10mm then it is accurate.
If your ruler says that the 10 cm block is 10 mm then you have a problem. :D
Oops [emoji23]
But I am sure you got the point.


earlgo
Posts: 1794
Joined: Sat Jan 29, 2011 11:38 am
Location: NE Ohio

Re: How precise are measuring instruments?

Post by earlgo »

For accuracy may I remind you of these handy gadgets?
Gage Block 950.jpg
--earlgo
Found this at a used tool shop and it cost all of a couple of bucks. Thought I'd save some history.
Before you do anything, you must do something else first. - Washington's principle.
SteveM
Posts: 7763
Joined: Mon Jun 27, 2005 6:18 pm
Location: Wisconsin

Re: How precise are measuring instruments?

Post by SteveM »

earlgo wrote: Tue Dec 28, 2021 10:08 am For accuracy may I remind you of these handy gadgets? Gage Block 950.jpg
--earlgo
Found this at a used tool shop and it cost all of a couple of bucks. Thought I'd save some history.
I was at an estate sale and the guy had a full set of Ford Johansson blocks. I had never seen an actual set. I think they dated to the Model A era (1930's).

Steve
thunderskunk
Posts: 109
Joined: Mon Jul 31, 2017 8:24 pm
Location: Vermont

Re: How precise are measuring instruments?

Post by thunderskunk »

Mr Ron wrote: The question has come up as to how square a table saw blade is to the table. If you want the blade to be exactly 90° square to the table, you would use a machinist's square and check for any light (gap) between the square and the blade. The question is: how precise is a machinist's square? Is there a ± value on a square? Can one say a square is 90° ± .xxx°? Precision tool manufacturers don't seem to publish accuracy. The same is also asked about length dimensions. Is a micrometer that measures to a thousandth actually .000 or is it ± tenths?
Someone already said it, but I’ll chime in. Accuracy is the end of the error train. It starts with design intent, which is translated into your 1.000” +/-.001” dimension. Let’s call it a distance and say it’s between two planes. We can’t ignore GD&T because rule #1 is “Size limits control surface form.” It’s like fight club, but rule #1 says you MUST talk about fight club.

So these two planar surfaces have zones in which all points on each surface must lie. It's important because how perpendicular your micrometer is to the surfaces influences the final measurement, and you typically square the anvils to a surface. If the design intent of the surfaces being measured is for the object to fit through a space, having cock-eyed micrometer anvils will tell you your part is too big when it might be just fine, or vice versa if you land in a low spot. So we call one surface a datum and assume it's perfect. Because no surface is perfect, we get some error here.

The anvils on the micrometers will be cut to many times the precision the tool is expected to measure. Very flat, very parallel. Still not perfect, but the hope is the error is a fraction of the tolerance being inspected to. Thermal expansion adds some more error, and so does pressure applied to the screw per operator. Resolution and parallax fight each other a bit: the lower the resolution, the easier it is to “offset” bad parallax, but at the loss of decimal places. Assuming we have a high resolution and we’re using a vernier on the dial, parallax becomes more of a problem.

So if I measure the part twice and get the same answer, I say the gage is repeatable. If you use the same gage on the same part and get the same answers as me, the gage is reproducible. Do that with 9 more parts and we call that study a GRR, which is dependent on the tolerance being studied. The tighter the tolerance, the more confident we want to be that the variation in our measurement system is much less than said tolerance, so that variation is typically referenced as a percentage of the tolerance.
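Here's a rough sketch of how those pieces relate (real GRR studies use ANOVA or the AIAG range method with tabulated constants; this simplified version just pools variances to show how study variation gets expressed as a percentage of the tolerance, and all names and numbers are mine):

```python
import statistics

def grr_percent(readings_by_operator, tolerance_band):
    """Crude %GRR sketch: repeatability as the pooled within-operator
    variance, reproducibility as the variance of the operator means.
    Study variation is taken as 6 sigma and expressed as a percentage
    of the total tolerance band."""
    repeat_var = statistics.mean(
        statistics.pvariance(r) for r in readings_by_operator)
    operator_means = [statistics.mean(r) for r in readings_by_operator]
    reprod_var = statistics.pvariance(operator_means)
    sigma = (repeat_var + reprod_var) ** 0.5
    return 100 * (6 * sigma) / tolerance_band

# Two operators, two readings each on the same part, spec band .002"
pct = grr_percent([[1.0000, 1.0002], [1.0001, 1.0003]], 0.002)
```

The tighter the tolerance band in the denominator, the bigger the percentage the same measurement scatter turns into, which is the point made above.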

https://uploads.tapatalk-cdn.com/202112 ... 572d86.jpg


All that error combined gives you your precision “cloud.” The center of that cloud compared to the real value you’ve measured is accuracy. Unfortunately that “real value” has to have precision attached to it as well.

All that to say a machinist's square must play the same game: the perpendicularity of one leg to the other must be several times the precision of whatever you are measuring to get a good reading. With a square, you form a perfect plane between two touch points on one leg, then butt the other leg against the perpendicular surface. If you want to report the angle, you have to form two lines and compare them in a constrained plane that, again, meets design intent. With a machinist square, it's only possible to do this on the outside. If you wanted an average or something based on low spots, that's a different functional gage, or just a CMM program.

So if someone sold you a square and said it was 90 degrees +/-.01 degrees… that doesn't guarantee a heck of a lot unless it was measured to make contact in the same spots you intend the tool to make contact. The right way is to report the flatness of one leg, call it a datum, then report the flatness of the other leg and the perpendicularity of that surface to your datum.

I might also be blowing smoke. I’m still learning this stuff, but I’ve got some good mentors.
"We'll cross that bridge once we realize nobody ever built one."
thunderskunk
Posts: 109
Joined: Mon Jul 31, 2017 8:24 pm
Location: Vermont

Re: How precise are measuring instruments?

Post by thunderskunk »

rmac wrote:I need help with a couple of your acronyms:
thunderskunk wrote: We can't ignore GD&T because ...
What's GD&T?
thunderskunk wrote: we call that study a GRR
And what's GRR?

-- Russell Mac
GD&T = Geometric Dimensioning and Tolerancing. It's the "design intent" way to dimension a part. There's a whole standard (ASME Y14.5) that says what means what, which is where rule #1 comes from.

If what I really care about with a bolt pattern on a plate is that it fits with another plate using bolts, linear dimensions and hole diameters don’t tell that story. It likely over-tolerances the part, which technically makes it more expensive to make. Using feature control frames, you can gain “bonus tolerances” like: if your hole diameter is bigger, you gain more tolerance on the distance between holes because the bolt has more clearance to fit between the two plates.

It can be complex, yes, but once both the design engineer and machinist have a grasp of it, it’s better for both parties.

GRR = Gage Repeatability and Reproducibility study. The result is the % error of a given gage. Say I need to measure 1.000 +/-.001, but the error in my calipers is 40% of the tolerance (the band is .002 total, so .0008" of variation). That means a reading of 1.0005" could actually be 1.0013", which is scrap, but you wouldn't know it. To be sure the part is in tolerance, the reading would have to fall between 0.9998" and 1.0002".
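The arithmetic in that caliper example can be checked in a few lines; the numbers below are the hypothetical ones from the example, not from any real study:

```python
# Guard-band sketch using the hypothetical caliper numbers above.
nominal, tol = 1.000, 0.001        # part spec: 1.000" +/- .001"
band = 2 * tol                     # total tolerance band: .002"
gage_error = 0.40 * band           # 40% GRR: a reading can be off by .0008"

# A reading of 1.0005" might really be anywhere in this window:
reading = 1.0005
true_low = reading - gage_error    # could be as small as 0.9997"
true_high = reading + gage_error   # or as large as 1.0013" -- scrap

# Readings that guarantee a good part despite the gage error:
accept_low = (nominal - tol) + gage_error
accept_high = (nominal + tol) - gage_error
```

Shrinking the acceptance window like this is usually called guard-banding: the worse the gage, the narrower the band of readings you can actually trust.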
"We'll cross that bridge once we realize nobody ever built one."
User avatar
rmac
Posts: 787
Joined: Tue Jun 26, 2012 12:48 am
Location: Phoenix, Arizona

Re: How precise are measuring instruments?

Post by rmac »

Thanks. That helps.
Post Reply