Lathe alignment
- GadgetBuilder
- Posts: 139
- Joined: Thu Apr 12, 2007 2:34 pm
- Location: Newtown, CT
Since I hadn't used the cantilever formula previously I was concerned that I could be off by a lot and not realize it. To verify that the calculated result is not erroneous I chucked the original 1/2" test rod, set the DTI to read vertically and added a 1.5 pound weight to the end of the 10" rod; deflection was about 3 thou. I'm more comfortable with my calculated result of 7 tenths now that I have some experimental results that are in the same ballpark. The 1/2" test rod is 19" long and weighs about 1 pound.
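Both numbers can be checked against the standard cantilever formulas: deflection PL^3/(3EI) for an end load and wL^4/(8EI) for self-weight sag at the free end. A rough sketch in Python, assuming a 1/2" steel rod (E about 30e6 psi, density about 0.284 lb/in^3) treated as rigidly fixed at the chuck; real chuck jaws are not a perfect fixed end, so measured values will differ:

```python
import math

# Assumed material/geometry values (not stated in the post itself)
E = 30e6            # Young's modulus for steel, psi
DENSITY = 0.284     # steel, lb/in^3
d = 0.5             # rod diameter, inches
L = 10.0            # overhang from chuck to measurement point, inches

I = math.pi * d**4 / 64            # area moment of inertia, in^4
w = DENSITY * math.pi * d**2 / 4   # rod weight per inch of length, lb/in

# End-load deflection: delta = P * L^3 / (3 * E * I)
P = 1.5                                   # test weight, lb
end_load_sag = P * L**3 / (3 * E * I)     # on the order of a few thou

# Self-weight sag at the free end: delta = w * L^4 / (8 * E * I)
self_weight_sag = w * L**4 / (8 * E * I)  # on the order of tenths

print(f"end-load deflection: {end_load_sag:.4f} in")
print(f"self-weight sag:     {self_weight_sag:.5f} in")
```

The formulas put the 1.5 lb end-load deflection at roughly 5 thou against the measured 3 thou, and the self-weight sag at roughly 0.7 thou; both are the same ballpark agreement described above, with the difference plausibly down to where the effective fixed point sits inside the jaws.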
The directions on my site for RDM indicate that the DTI should be centered on the test rod. Assuming this is done, then when measuring horizontally for RDM I calculate the horizontal circular error for sag of the 1/2" rod to be:
1 - cos(sag/radius) = 1 - cos(0.0007/0.25) = 1 - cos(0.0028 rad) ≈ 1 - 0.9999961 = 0.0000039
To get the actual horizontal error, multiply this by the radius, 1/4", giving roughly 0.000001" (about a microinch)
For vertical RDM measurement the sag error would be the full 0.0007 inch
Where both of the above errors are over a distance of 10"
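A quick numerical check of the cosine term (evaluating the angle in radians throughout), together with the small-angle identity r(1 - cos(s/r)) ≈ s^2/(2r); either way the horizontal error lands at the microinch level, so the "ignore it" conclusion below holds:

```python
import math

r = 0.25      # test-bar radius, inches
s = 0.0007    # vertical sag at the DTI position, inches

# Angle subtended at the bar's center by the sag (radians)
theta = s / r

# Horizontal shift of the bar's side point: r * (1 - cos(theta))
horizontal_error = r * (1 - math.cos(theta))

# Small-angle check: s^2 / (2*r) should give nearly the same result
approx = s**2 / (2 * r)

print(f"horizontal error:   {horizontal_error:.3e} in")  # about one microinch
print(f"small-angle approx: {approx:.3e} in")
```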
When aligning the headstock, I believe the horizontal error due to sag can be ignored. The vertical error due to sag of the test bar will cause the headstock to be corrected to point up slightly to compensate. However, this will not affect taper of work produced on the lathe because the cosine function now works in the other plane so taper due to the resulting headstock vertical angle should be unmeasurable. I assume there are some uses for the lathe where this level of vertical error would be a problem but I haven't run into them (yet).
My conclusion, assuming the above calculations are correct, is that RDM produces an error because of sag in a 1/2" test bar but this error is of no practical consequence. This may be why, as you noted, this error source is not mentioned in descriptions of RDM.
Chucks typically hold work at a very slight angle to the jaws so RDM accommodates this chucking angle in its measurement technique and calculations, allowing RDM to work despite this error source. I assumed it was difficult to accommodate chucking error when using a flat test bar; a reference to a description of alignment with a flat bar would be appreciated.
It is possible I've erred in my calculations. Since RDM is suggested as a practical and reliable method on my site I welcome input that will help me understand the method better and so ensure that the information there is correct. By the same token, if RDM is fatally flawed I'd like to know so I can remove it from my site to avoid misleading others.
John
Hi There,
"Webb, the problem with that is it will show inaccuracies of the headstock as well as the cross slide. One or both could be out of alignment."
That is quite correct. I was just trying to emphasize that there is another possible reason for cutting a convex other than the reasons previously stated. If you re-read my previous post, I started out by saying:
"If the headstock alignment isn't known, then this test isn't very meaningful. If the headstock is aligned with the bed, it is possible that the cross slide isn't square with the ways." (emphasis added)
Good Luck!
-Blue Chips-
Webb
GadgetBuilder wrote:
To verify that the calculated result is not erroneous I chucked the original 1/2" test rod, set the DTI to read vertically and added a 1.5 pound weight to the end of the 10" rod; deflection was about 3 thou.
I think that what you have done is actually an exaggeration of the sag, since you used more weight and placed it at the end. Placing a one pound weight at the midpoint would have given a reading closer to the actual sag of the bar due to its own weight.
That said, it strengthens your conclusion that the sag is not a major factor.
I picked up a used Morse taper 3 milling arbor that, when I get around to it, I will use as a test bar (after verifying how straight it is). I thought about making rings for it that I can make test cuts on, and use the spacers to put them at the ends of the bar (it has a center hole in the end for support).
Steve
There is another consideration that may be forgotten in using the "RDM".....
The benefit of the method is that it is an averaging method, and so the rods don't have to be straight, nor do they have to mount straight in the chuck.
However, if the rod is NOT straight, etc, you have to find center at each measuring point, you cannot simply run the carriage back and forth.
THAT means you must adjust the height of the indicator. Now you have introduced another variable, which is the nature of the height adjustment on your holder. If not such as to adjust in a perfect vertical plane, you may not get a true reading. The indicator may have moved horizontally relative to the position it had for the first measurement, invalidating the comparative measurement to at least some unknown degree.
Since the "non-straightness" due to rod or chuck is likely to be quite a bit more than the sag, this means your error by NOT adjusting may be very significant.
However, your error if you DO adjust may also be quite significant.
Then also, if the rod is not straight, it puts a premium on measuring with the rod held exactly 180 from the first measurement for the second. The more "non-straight" the rod or chuck is, the more important an error in that setting becomes.
By the time you compensate, if you can, using a plain "two collars" measurement starts to look pretty good!
- GadgetBuilder
- Posts: 139
- Joined: Thu Apr 12, 2007 2:34 pm
- Location: Newtown, CT
J Tiers wrote:
There is another consideration that may be forgotten in using the "RDM".....
The benefit of the method is that it is an averaging method, and so the rods don't have to be straight, nor do they have to mount straight in the chuck.
However, if the rod is NOT straight, etc, you have to find center at each measuring point, you cannot simply run the carriage back and forth.
THAT means you must adjust the height of the indicator. Now you have introduced another variable, which is the nature of the height adjustment on your holder. <snip>
Then also, if the rod is not straight, it puts a premium on measuring with the rod held exactly 180 from the first measurement for the second. The more "non-straight" the rod or chuck is, the more important an error in that setting becomes.
By the time you compensate, if you can, using a plain "two collars" measurement starts to look pretty good!

The concern you bring up, a bent bar, was addressed and analyzed in the original paper by John Wasser. John's original paper has disappeared again but it can be found in the 7x12 Yahoo group's files, where I placed it with John's permission. The pertinent section is copied below; it loses a little in the copying, so if it is difficult to understand I can send a copy to anyone interested:
----------------------------------------------------------
Why This Method Works
The bar acts as a circular cam. With a perfectly straight bar in a perfect chuck the bar is concentric with the spindle axis. Since we don't live in a perfect world there is almost always a slight offset between the center of the bar and the spindle axis. This offset varies from place to place along the bar due to slight bends and/or imperfect mounting.
At any place you pick along the bar the center of the "cam" is some unknown distance from the spindle axis. We'll call this unknown distance 'X'. As you turn the spindle the high measurement will be "Bar_radius + X" and the low measurement will be "Bar_radius - X". Their average will be:
((Bar_radius + X) + (Bar_radius - X)) / 2 =
((Bar_radius + Bar_radius) + (X - X)) / 2 =
(2 * Bar_radius) / 2 =
Bar_radius
As you can see, the value and direction of the deviation have no influence on the final result. That is why it doesn't matter if the chuck is accurate or the bar has one or more slight bends.
If the bar is not the same diameter at both places we need to measure the diameters and adjust the readings. Averaging the high and low readings gives us a reading for the local bar radius. We convert that to a reading for the bar center by measuring the bar diameter and subtracting half the diameter (a.k.a. The Radius).
----------------------------------------------------------------
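Wasser's cancellation argument can be sketched in a few lines of Python: whatever the local offset X between bar center and spindle axis (the offsets below are arbitrary example values), the average of the high and low readings recovers the bar radius exactly.

```python
import random

random.seed(1)

BAR_RADIUS = 0.250   # true local bar radius, inches

# At each measuring point along the bar, the local center sits some
# unknown distance X from the spindle axis (bends plus chucking error).
for point in range(5):
    X = random.uniform(0.0, 0.010)   # unknown offset at this point
    high = BAR_RADIUS + X            # reading with the offset toward the DTI
    low = BAR_RADIUS - X             # reading half a turn later
    average = (high + low) / 2       # X cancels: (R + X + R - X) / 2 = R
    print(f"X = {X:.4f}  ->  average reading = {average:.4f}")
```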
I didn't find a flaw in John's analysis which says you can simply run the carriage back and forth to make measurements even when the test bar is NOT straight.
Perhaps you could share your analysis so I can understand why you believe the indicator must be repositioned between measurements.
Also, do you agree that my earlier analysis concerning sag is correct or have I miscalculated somewhere? I'll be happy to address each of your concerns but it would be helpful to know when you agree a concern has been handled to your satisfaction.
Well, your cosine error is also active if the bar is not straight.
The argument is presumably that the cosine error when high is offset by the same cosine error in reverse when low, placing the indicator tip in the same position either way. This appears true if you set on center.
I think there are some assumptions inherent in that which may not be supportable in specific cases, as opposed to a general theory.
One is an assumption that the actual horizontal plane through the center of rotation is truly FINDABLE with common equipment when the combination of an off-center chuck, a non-straight bar, and unspecified and unknown errors of alignment exist.
I would argue that the usual indicator holder does not provide a measurement which allows one to actually adjust to the average center position so that the indicator tip is in fact on-center, when you may NOT assume ANY portion of the test bar is correct. Inferring it from other measurements is problematic when you have a set of unknown errors existing.
One might adjust until the readings are identical with the chuck at positions 180 apart, AND also at a local maximum. This makes some assumptions about the bar, and also is quite dependent on exact 180 positions, more so as the bar is farther off straight. It is possible to use a second indicator on the chuck jaws to get that right, but that is a second, unmentioned device.
The more "non-straight" the bar is, the more significant an error of vertical position is. The errors will not average out.
Even with a perfectly straight bar, finding the center plane with the indicator in any of the usual holders is not trivial. You must arrange the holder so that the vertical movement has NO horizontal component whatever, and then find the local maximum. It may be a good source of errors when combined with bar sag in the standard RDM even if a perfect chuck is used.
If you fail to move the indicator in a vertical line, you may not find the correct local maximum. You will find a point tangent to the circle, but not necessarily on the plane of the center of rotation.
The more off-center, the more the influence of sag, of course, as the roundness of the bar gradually increases the errors.
The point is not that RDM is useless or inherently and irretrievably erroneous (although it cannot actually separate and pinpoint errors without an initial leveling to reduce the number of variables).
The main point is that by the time you carefully eliminate the various sources of possible errors in the "simple" RDM technique, you could have used the two-collars method.
Since that uses an INTERNAL REFERENCE, one dependent ONLY on a direct linear measurement, it is not so subject to the various possible sources of error in the RDM.
The actual measurement is a comparison of two diameter measurements.
The test piece can be a tube of any largeish diameter desired, almost immune to sag issues, however small they may be.
The test piece automatically aligns itself to the center of rotation, and that center is not directly concerned in the measurement in any case.
The tool need not actually be on the centerline with any more accuracy than for general turning.
It is an "in-use" test, replicating the actions which the test is intended to validate...
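The "almost immune to sag" claim for a tube is easy to quantify, since a tube's stiffness-to-weight ratio rises quickly with diameter. A comparison sketch, assuming an arbitrary 1.5" OD x 0.12" wall steel tube against the 1/2" solid rod over the same 10" overhang (the tube dimensions are example values, not from the post):

```python
import math

E = 30e6          # steel, psi (assumed)
DENSITY = 0.284   # steel, lb/in^3 (assumed)
L = 10.0          # overhang, inches

def self_weight_sag(I, area):
    """Free-end sag of a cantilever under its own weight: w*L^4 / (8*E*I)."""
    w = DENSITY * area               # weight per inch of length
    return w * L**4 / (8 * E * I)

# 1/2" solid rod
d = 0.5
rod_sag = self_weight_sag(math.pi * d**4 / 64, math.pi * d**2 / 4)

# 1.5" OD x 0.12" wall tube (example dimensions)
do, di = 1.5, 1.5 - 2 * 0.12
tube_I = math.pi * (do**4 - di**4) / 64
tube_area = math.pi * (do**2 - di**2) / 4
tube_sag = self_weight_sag(tube_I, tube_area)

print(f"rod sag:  {rod_sag:.6f} in")
print(f"tube sag: {tube_sag:.6f} in")  # more than 10x smaller
```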
- GadgetBuilder
- Posts: 139
- Joined: Thu Apr 12, 2007 2:34 pm
- Location: Newtown, CT
I extracted what seems to be the key part of your post; if I have misunderstood here please correct me.
J Tiers wrote:
Well, your cosine error is also active if the bar is not straight.
The argument is presumably that the cosine error when high is offset by the same cosine error in reverse when low, placing the indicator tip in the same position either way. This appears true if you set on center.
I think there are some assumptions inherent in that which may not be supportable in specific cases, as opposed to a general theory.
One is an assumption that the actual horizontal plane through the center of rotation is truly FINDABLE with common equipment when the combination of an off-center chuck, a non-straight bar, and unspecified and unknown errors of alignment exist.
I would argue that the usual indicator holder does not provide a measurement which allows one to actually adjust to the average center position so that the indicator tip is in fact on-center, when you may NOT assume ANY portion of the test bar is correct. Inferring it from other measurements is problematic when you have a set of unknown errors existing.
<snip>
Even with a perfectly straight bar, finding the center plane with the indicator in any of the usual holders is not trivial. You must arrange the holder so that the vertical movement has NO horizontal component whatever, and then find the local maximum. It may be a good source of errors when combined with bar sag in the standard RDM even if a perfect chuck is used.
If you fail to move the indicator in a vertical line, you may not find the correct local maximum. You will find a point tangent to the circle, but not necessarily on the plane of the center of rotation.
<snip>
Your point is well taken. My site basically echoes the original RDM paper in saying you should: "Carefully adjust the vertical position of the DTI for maximum deflection (center of the test rod's side)". A nice theory but it does not specify HOW to move the DTI to the vertical center and, as you pointed out, this is more complex than it seems.
I thought about the problem and came up with what seems like a way around it. To test the setup I used my el cheapo mag base with its C-shaped hinge gadget that fine tunes position. With the test bar in place I set an angle plate on the ways, moved it up to contact the test bar, then clamped it to the bed; this provides a vertical reference plane.
The test bar was removed temporarily and the mag base was installed on the cross slide with the fine-tuner arm parallel to the ways. The fine tune mechanism was set with the opening in the C down (to approximate vertical movement) and the DTI finger extending toward the rear of the lathe. The cross slide was positioned so the finger contacted the reference plane. The fine tuner adjustment was used to move the finger on the angle plate and the change in reading was noted. The fine tuner C (including the whole arm, of course) was rotated in the appropriate direction until the change in reading due to fine tuner adjustment was reduced to zero, allowing the DTI finger to be moved vertically by the fine tuner.
I believe this addition to my site will address the non-vertical movement error you pointed out and make finding the horizontal plane through the center of rotation possible.
I view the error from non-vertical movement as a first order error - failure to address it could cause a noticeable error in performance of the lathe. Other practical problems, like sag error, cause second order effects which I believe are unlikely to result in easily detectable performance errors. Simply assuming all errors are first order is always safe; correctly classifying errors as first or second order is often a challenge.
I found that RDM required iteration when I used it to correct the alignment on my 7x12. Part of this was establishing the effect of shim thickness but this non-vertical movement issue could well have been a contributor.
I appreciate your time and effort in helping me better understand RDM.
J Tiers wrote:
The main point is that by the time you carefully eliminate the various sources of possible errors in the "simple" RDM technique, you could have used the two-collars method.
Since that uses an INTERNAL REFERENCE, one dependent ONLY on a direct linear measurement, it is not so subject to the various possible sources of error in the RDM.

The actual point is the following: essentially, that if a "simplified" test method becomes more complicated than a different one, it may just not be the right method.
The two collars method, for instance, is very direct, and requires no more equipment, as no indicator is needed, only a micrometer.
You do have to prepare a test specimen, but the preparation is trivial, and the test specimen can be used many times.
- GadgetBuilder
- Posts: 139
- Joined: Thu Apr 12, 2007 2:34 pm
- Location: Newtown, CT
I think I went down the garden path in my initial analysis by accepting the argument that finding the tangent, rather than the horizontal plane through the center of rotation, causes a problem. From there I went on to solving the problem of moving the DTI in the vertical plane, and things went downhill rapidly.
The following addresses the actual concern: the DTI doesn't contact the test bar at the desired horizontal plane through the center of rotation.
The test bar is chucked at a slight angle, so it describes a narrow cone around the center of rotation. The DTI sees it as a circle, and it contacts this circle slightly above or below the desired horizontal plane through the center of rotation. When the contact point is not on that plane, the apparent error differs from the actual error by the cosine term discussed earlier. The effect of the cosine term is to magnify the reported error somewhat non-linearly: the farther the DTI is off the center of rotation (as a fraction of the test bar radius), the greater the magnification.
So, how does this affect RDM in practice? It seems that RDM will over-report the magnitude (but not the direction) of error by a small amount for the small angular errors (say 10 degrees or so) in the contact point likely in practice. However, determining shim thickness is generally an iterative process, so the user is unlikely to notice the change in sensitivity and will simply shim as needed to get the error as close to zero as possible, often limited by patience.
And this is how RDM worked for me, i.e. I tried a shim to see how it changed the error, evaluated the effect, made an additional correction, etc.
RDM seems robust in the sense that it provides feedback that lets the user iterate directly toward a solution. This analysis also says that the measurements are not always precise enough to move to the solution in a single step - and this seems to be the point you are making with the concerns you have raised.
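A rough feel for the size of that over-report, assuming the magnification scales as 1/cos(phi) for a contact point an angle phi off the horizontal center plane (this is one simplified reading of the cosine argument above, not an exact derivation; the error value is an arbitrary example):

```python
import math

actual_error = 0.0010   # true horizontal offset, inches (example value)

# Reported error grows roughly as 1/cos(phi) as the DTI contact point
# moves off the horizontal plane through the center of rotation.
for phi_deg in (0, 5, 10, 20):
    phi = math.radians(phi_deg)
    reported = actual_error / math.cos(phi)
    over = (reported / actual_error - 1) * 100
    print(f"{phi_deg:2d} deg off-plane: reported {reported:.6f} in ({over:.1f}% high)")
```

Under this model a 10 degree placement error over-reports by about 1.5 percent, small enough to vanish into the iterative shimming described above.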
Not really.......
It is a matter of a "simple" process which turns out to have more complexities, each generating a small error if not corrected. And the fact that it ends up being an indirect measurement.
And the fact that there is an alternate simple process which requires NO such set of compensations, a process which gives a direct answer including magnitude, and is a direct measurement of the effects of the condition being tested for.
The question is how long will one pursue the "complicated but simple" process when the alternate exists?
As an academic point, I suppose it could be pursued, but it seems to be needless. It clearly is difficult to get down to tenths with RDM, as the errors all generate measurement noise in that range.
So why not look at alternatives, when a known alternate method gives you the tenths in one simple measurement?
Particularly when you must first make a different measurement and corrective action in either case ("leveling" out the twist), with all but the shortest and stiffest machines.
- GadgetBuilder
- Posts: 139
- Joined: Thu Apr 12, 2007 2:34 pm
- Location: Newtown, CT
I have concerns with the two collars method paralleling the concerns you expressed with RDM, referring to your description of the method above.
I expect for best results the pipe used as the test piece must be steel, and cut easily with a good finish - so a randomly chosen pipe may not be appropriate based on the type of steel. Also, pipe is produced using different processes and these processes affect the turning characteristics of the pipe. For example, some pipe has a welded seam that might affect the hardness locally and the roundness when turned. Other processes for pipe production may not leave the metal grain structure and hardness the same all over and/or may leave stresses in the metal that are relieved by turning -- unless the steel pipe has been heat treated to normalize it. The effects on roundness of the collars may be small but I don't think it can be dismissed when validating your claimed accuracy of tenths - unless careful analysis shows the effect is negligible.
Your procedure for cutting the collars highlights your attempt to minimize error from tool pressure. However, tools aren't infinitely sharp and they don't cut with zero pressure -- there must be some pressure exerted on the work by the tool in order to cut. Although you specify a large diameter pipe it still deflects at the far end from this tool pressure, similar to the test rod sag issue you raised with RDM. EXCEPT here the pressure isn't due to a known weight, it is due to tool pressure PLUS weight of the test piece. To analyze this error source you must measure the pressure the tool exerts and add it (as a vector) to the weight, then calculate the movement of the test piece due to this combined force to find the bending effect on the test piece and then decide what effect this deflection has on the diameter produced. Similar to what I did in response to your concern with bending of the test bar in RDM, except here it is FAR more complex. The difference in deflection between the chuck collar end and the outboard collar end may well affect diameter slightly despite the measures you suggest to minimize this; you shouldn't ignore this unless you can analyze it and show that it doesn't affect the outcome in a meaningful way.
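The combined-load deflection described above can at least be bounded with the same cantilever formulas, treating tool pressure as a horizontal point load at the outboard collar and the pipe's weight as a vertical distributed load, then combining the two deflections as a vector. Every number here is an assumed example; the tool force in particular is a guess, as cutting forces are notoriously hard to pin down:

```python
import math

E = 30e6          # steel, psi (assumed)
DENSITY = 0.284   # steel, lb/in^3 (assumed)
L = 10.0          # chuck-to-outboard-collar distance, inches (assumed)

# Example test piece: 1.5" OD x 0.12" wall steel pipe (assumed)
do, di = 1.5, 1.26
I = math.pi * (do**4 - di**4) / 64
area = math.pi * (do**2 - di**2) / 4

# Horizontal deflection from tool pressure (point load at the free end)
tool_force = 2.0                          # lb, a guess for a light finish cut
horiz = tool_force * L**3 / (3 * E * I)

# Vertical sag from the pipe's own weight (distributed load)
w = DENSITY * area
vert = w * L**4 / (8 * E * I)

# Combined deflection as a vector sum
total = math.hypot(horiz, vert)
print(f"horizontal {horiz:.6f} in, vertical {vert:.6f} in, combined {total:.6f} in")
```

Even with these generous assumptions the combined deflection lands in the tenths; whether and how that maps into a diameter difference between the two collars is the harder question raised above.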
I think there are further assumptions about the lathe that were perhaps overlooked. Any idiosyncrasies in the gears, leadscrew, half-nuts, feed clutch, binding in the carriage handwheel, rack, etc. could affect surface finish in a subtle (or not so subtle) way. Since the two collars method uses all of these, they all should be added as items assumed to work correctly. Cutting steel at 6 diameters from the chuck without chatter may be making an assumption about the overall condition and quality of the lathe.
In making several passes without touching the cross slide, the tool must cut in both directions. Not all tools cut well going both ways so it would make sense to add this info to your procedure. Since multiple passes are made with power feed, the feed must be reversed for each pass; on my lathe this requires stopping to change feed direction, making the operation more time consuming than one might expect.
There are some additional practical details when applying the measured results to actually correct a headstock error. I found that manuevering the headstock around to insert shims was easier with the test bar removed from the chuck. Not a problem when using RDM because the procedure automatically corrects for the difference in chucking when checking the result of adding shims. With the two collars method you would likely want to avoid removing the test pipe lest you have to re-true the collars; I assume accidentally bumping the test pipe while inserting shims would require re-truing to be certain the chucking hadn't been disturbed.
A point of practical interest is whether your claimed accuracy of a tenth is useful in practice. That is, given a very accurate measurement of spindle error can you choose a shim thickness that will precisely correct this measured error? I found shimming was an iterative process affected slightly by bolt torque, where slightly on my lathe was a few tenths at the far end of the 10 inch bar. When shimming I made a guess at the necessary shim thickness, tried it, measured again, adjusted the shim and/or bolt torque, and tried again. In correcting both horizontal and vertical errors (simultaneously) I found some interaction between them. RDM easily provides rapid feedback on the effect of bolt torque; the two collars method might not work so well on bolt torque effects.
Another practical detail: when inserting shims on my lathe I had to remove the headstock so the motor, belt, change gears, leadscrew, etc. was removed; using RDM I didn't need to reassemble these until shimming was complete. The two collars method would require complete reassembly (power feed is needed) in order to make another measurement, so the iterative shimming/re-checking/re-shimming approach would be far less practical on my machine. I've only shimmed the headstock on one lathe so I don't know whether this would be a concern with other types of lathes but it is a consideration.
My point here is that I believe there are at least as many error sources, unknowns, and practical considerations with the two collars method as there are with RDM. Claiming an accuracy of a tenth infers you can PROVE that all of the above noted error sources (and possibly others) won't add up to more than a tenth - a challenge that I wouldn't attempt. And when you get into actually using either method, my experience is that ease and speed of measurement depends on the overall situation. Plus, response of a lathe to shims in the real world isn't as predictable as one might theorize or hope.
J Tiers wrote:
It's even easier to chuck a piece of pipe as large as convenient, sticking out about 5 or 6 diameters max.
Cut an undercut, leaving two larger diameter rings, one near the chuck and one at the end.
Skim cut the rings until smooth (very light cuts with a sharp tool; take several spring cuts, power fed, and don't touch the crossfeed). Measure the sizes. If they are the same, there is no misalignment.
If too large at the T/S end, the headstock is pointing somewhat toward the rear RELATIVE TO BED. (Possibly the headstock is pointing somewhat up or down as well.*)
If too small at the T/S end, the headstock is pointing to the front.
<snip>

First, using a piece of large diameter pipe reduces the sag issue (which you mentioned for RDM), and limiting the length to 6 diameters reduces it further (but doesn't eliminate it; see below). Cutting at 5 or 6 diameters from the chuck is not a desirable situation, however, and it may affect surface finish (and diameter) differently on the outboard collar than on the inboard collar. Surface finish can affect accuracy when measuring diameter with a mic; not a lot, but perhaps in the tenths. (RDM allows using a polished rod if noise from surface finish is a problem.)
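The quoted two-collars test can be reduced to numbers with simple trigonometry. Here is a minimal sketch; the collar diameters and spacing below are made-up examples, not measurements from any real lathe:

```python
# Sketch: turning a two-collars measurement into a headstock alignment figure.
# All numbers are illustrative assumptions, not real measurements.

import math

def two_collar_taper(d_inboard, d_outboard, collar_spacing):
    """Return (diametral taper per inch, misalignment angle in radians).

    Positive taper means the tailstock-end collar is larger, i.e. the
    headstock points toward the rear relative to the bed (per the quote).
    """
    taper_per_inch = (d_outboard - d_inboard) / collar_spacing
    # The centerline angle follows the radius change, half the diametral taper.
    angle = math.atan2((d_outboard - d_inboard) / 2, collar_spacing)
    return taper_per_inch, angle

# Example: collars 10" apart, outboard collar 0.0002" larger in diameter.
taper, angle = two_collar_taper(1.5000, 1.5002, 10.0)
print(f"taper: {taper * 1000:.3f} thou/inch on diameter")
print(f"angle: {math.degrees(angle) * 3600:.1f} arc-seconds")
```

A 0.0002" diameter difference over 10" works out to only a couple of arc-seconds of misalignment, which shows why such small collar readings still matter.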
I expect that for best results the pipe used as the test piece must be steel that cuts easily with a good finish, so a randomly chosen pipe may not be appropriate. Also, pipe is produced by different processes, and these processes affect its turning characteristics. For example, some pipe has a welded seam that may locally affect hardness, and therefore roundness, when turned. Other production processes may leave the grain structure and hardness uneven and/or leave stresses in the metal that are relieved by turning, unless the pipe has been heat treated to normalize it. The effect on the roundness of the collars may be small, but I don't think it can be dismissed when validating your claimed accuracy of tenths, unless careful analysis shows it is negligible.
Your procedure for cutting the collars highlights your attempt to minimize error from tool pressure. However, tools aren't infinitely sharp and don't cut with zero pressure; the tool must exert some pressure on the work in order to cut. Although you specify a large diameter pipe, it still deflects at the far end from this tool pressure, similar to the test rod sag issue you raised with RDM, EXCEPT that here the force isn't a known weight: it is tool pressure PLUS the weight of the test piece. To analyze this error source you must measure the force the tool exerts, add it (as a vector) to the weight, calculate the deflection of the test piece under this combined force, and then decide what effect that deflection has on the diameter produced. This is similar to what I did in response to your concern about bending of the test bar in RDM, except FAR more complex. The difference in deflection between the chuck-end collar and the outboard collar may well affect diameter slightly despite the measures you suggest to minimize it; you shouldn't ignore this unless you can analyze it and show the effect on the outcome is negligible.
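To get a feel for the magnitudes involved, the tip deflection of a chucked pipe can be estimated from standard cantilever formulas. This is only a sketch: the pipe dimensions, stick-out, and the 2 lb radial tool-force figure are guesses for illustration, not measured values.

```python
# Sketch: tip deflection of a cantilevered test pipe under a radial tool
# force plus its own weight. Dimensions and the tool force are assumptions.

import math

E = 30e6          # Young's modulus of steel, psi (typical handbook value)
DENSITY = 0.284   # steel, lb/in^3

def pipe_tip_deflection(od, id_, length, tool_force):
    """Tip deflection (inches): point load at tip + uniform self-weight."""
    I = math.pi * (od**4 - id_**4) / 64.0        # area moment of inertia, in^4
    area = math.pi * (od**2 - id_**2) / 4.0      # cross-section area, in^2
    w = DENSITY * area                           # weight per inch, lb/in
    d_tool = tool_force * length**3 / (3 * E * I)    # F*L^3 / (3*E*I)
    d_weight = w * length**4 / (8 * E * I)           # w*L^4 / (8*E*I)
    return d_tool + d_weight

# Example: 2" OD x 1.6" ID steel pipe, 12" stick-out, guessed 2 lb tool force.
d = pipe_tip_deflection(2.0, 1.6, 12.0, 2.0)
print(f"tip deflection: {d * 1e6:.0f} microinches")
```

With these assumed numbers the deflection comes out in the low hundreds of microinches, i.e. right in the "tenths" territory being debated, which is why the difference in deflection between the two collar positions can't simply be waved away.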
I think there are further assumptions about the lathe that were perhaps overlooked. Any idiosyncrasies in the gears, leadscrew, half-nuts, feed clutch, binding in the carriage handwheel, rack, etc. could affect surface finish in a subtle (or not so subtle) way. Since the two collars method uses all of these, they all should be added as items assumed to work correctly. Cutting steel at 6 diameters from the chuck without chatter may be making an assumption about the overall condition and quality of the lathe.
In making several passes without touching the cross slide, the tool must cut in both directions. Not all tools cut well going both ways so it would make sense to add this info to your procedure. Since multiple passes are made with power feed, the feed must be reversed for each pass; on my lathe this requires stopping to change feed direction, making the operation more time consuming than one might expect.
There are some additional practical details when applying the measured results to actually correct a headstock error. I found that maneuvering the headstock around to insert shims was easier with the test bar removed from the chuck. That's not a problem with RDM, because the procedure automatically corrects for the difference in chucking when checking the result of adding shims. With the two collars method you would likely want to avoid removing the test pipe lest you have to re-true the collars; I assume accidentally bumping the test pipe while inserting shims would require re-truing to be certain the chucking hadn't been disturbed.
A point of practical interest is whether your claimed accuracy of a tenth is useful in practice. That is, given a very accurate measurement of spindle error can you choose a shim thickness that will precisely correct this measured error? I found shimming was an iterative process affected slightly by bolt torque, where slightly on my lathe was a few tenths at the far end of the 10 inch bar. When shimming I made a guess at the necessary shim thickness, tried it, measured again, adjusted the shim and/or bolt torque, and tried again. In correcting both horizontal and vertical errors (simultaneously) I found some interaction between them. RDM easily provides rapid feedback on the effect of bolt torque; the two collars method might not work so well on bolt torque effects.
Another practical detail: to insert shims on my lathe I had to remove the headstock, so the motor, belt, change gears, leadscrew, etc. were removed; using RDM I didn't need to reassemble these until shimming was complete. The two collars method would require complete reassembly (power feed is needed) before making another measurement, so the iterative shim/re-check/re-shim approach would be far less practical on my machine. I've only shimmed the headstock on one lathe, so I don't know whether this would be a concern on other types of lathes, but it is a consideration.
My point here is that I believe there are at least as many error sources, unknowns, and practical considerations with the two collars method as there are with RDM. Claiming an accuracy of a tenth implies you can PROVE that all of the above error sources (and possibly others) won't add up to more than a tenth, a challenge I wouldn't attempt. And when you actually use either method, my experience is that ease and speed of measurement depend on the overall situation. Plus, the response of a lathe to shims in the real world isn't as predictable as one might theorize or hope.
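As a rough illustration of how several small error sources could stack up past a tenth, here is a sketch; the individual magnitudes are invented placeholders, not measurements of either method:

```python
# Sketch: combining hypothetical error sources. Magnitudes are invented.

import math

# Hypothetical per-source errors, inches
errors = {
    "collar roundness": 0.00005,
    "tool-pressure deflection": 0.00008,
    "surface finish effect on mic reading": 0.0001,
    "mic resolution": 0.0001,
}

worst_case = sum(errors.values())                    # all errors align
rss = math.sqrt(sum(e**2 for e in errors.values()))  # independent sources
print(f"worst case: {worst_case * 1000:.3f} thou, RSS: {rss * 1000:.3f} thou")
```

Even the optimistic root-sum-square combination of these guessed values exceeds a tenth, which is the point: proving the total stays under 0.0001" requires bounding every source, not just the obvious ones.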
It is a very standard test run by the lathe builders for setup. They seem to have thought it was not so bad as you do.
While the error sources exist, they are "organic" i.e. part of the lathe, and so if the machine won't do the test without problems, perhaps it is better to question why a machine in that bad shape is being tested to tenths anyway.
You need not make multiple passes, that isn't for anything but ensuring a full cut all around.
The test uses the lathe AS IT WILL BE USED, and as such is a more "practical" test.
As for the tenths, your measurement should be capable of 3x better accuracy than you want the result to have. If your error sources are up to 0.001, you don't likely know your results to better than 0.003.
Since a tenths mic is easily available, and is used for a direct measurement, that is a reduction in error magnitude vs a procedure which requires a lot of fiddling and uses an indicator reading (typically) to 0.001 or perhaps 0.0005.
Surface finish certainly should be good, however a light cut with a sharp tool on reasonable material is capable of a sufficiently good finish.
Nothing in the world requires gummy 1018 pipe to be used, one would naturally pick the nicest cutting material...... Even pipe isn't required, with large diameter material the cosine error is reduced far below that from a 3/8" rod or the like. Use alloy aluminum if you prefer...
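The cosine-error comparison can be put in numbers using the error = r * (1 - cos(sag/r)) formula from earlier in the thread. This sketch uses guessed sag values (a stiffer large pipe sags less than a slender rod), so treat it as illustrative only:

```python
# Sketch: horizontal "cosine error" from sag, per the formula discussed
# earlier: error = r * (1 - cos(sag / r)). Sag values are guesses.

import math

def horizontal_sag_error(radius, sag):
    """Horizontal indicating error (inches) from vertical sag on a round bar."""
    return radius * (1.0 - math.cos(sag / radius))

# 3/8" rod, assumed 0.0007" sag over its stick-out
print(f"3/8 rod: {horizontal_sag_error(0.1875, 0.0007):.2e} in")
# 3" diameter pipe, stiffer, so assume only 0.0001" sag
print(f'3" pipe: {horizontal_sag_error(1.5, 0.0001):.2e} in')
```

With these assumptions both errors are well under a microinch, and the large pipe's is orders of magnitude smaller still, consistent with the claim that the cosine error is negligible for large diameter material.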
As I said, it is a standard outgoing QC test, as well as acceptance test........ didn't get that way by being error-riddled.
Your choice, of course.
P.S.
I surely hope you wouldn't use RDM or whatever to shim the lathe to straight..... It won't give you that information; you need to level to "close" first, but you probably know that......