  1. #1
    Join Date
    May 2006
    Posts
    27

    NC Controller difference

This might be a dumb question but here goes.

What is the real difference between the following two (other than price)?
A PC with a couple of Gecko drives, some servos, and Mach 3

And

An NC controller like Fanuc or Allen Bradley

    Thanks
    Curtis

  2. #2
    Join Date
    Dec 2003
    Posts
    24223
The main difference is that the former is a PC-based software controller and the servos are not closed back to the control.
With Fanuc, Mitsubishi, etc., and with PC-based systems that use a controller card like Galil, the motion control is closed back to the CNC controller; in these PC-based systems the PC really just acts as an HMI and the card takes care of closing the loop.
In an open-loop system (open to the controller) the CNC controller has no idea where the servos are positioned.
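To make the distinction concrete, here is a minimal sketch in Python (purely illustrative, not any vendor's firmware): open loop just emits pulses and never learns the result, while a closed loop corrects its command from encoder feedback. The gain and count values are made up for the example.

```python
# Minimal sketch of open-loop vs closed-loop control (illustrative only).

def open_loop_steps(target_steps):
    """Open loop: emit pulses and hope the motor followed them."""
    return target_steps  # the controller never learns the true position

def closed_loop_command(target, encoder_position, kp=0.5):
    """Closed loop: the command is corrected by the measured error."""
    error = target - encoder_position   # following error in counts
    return kp * error                   # demand sent to the servo amp

# Example: axis commanded to 1000 counts, but the encoder reads 940.
demand = closed_loop_command(1000, 940)
print(demand)  # 30.0 -> nonzero demand keeps driving until error shrinks
```

The point of the sketch is the second function's `encoder_position` argument: the open-loop version has no equivalent input, which is exactly why a step/direction-only controller "has no idea where the servos are positioned."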
    Al.
    CNC, Mechatronics Integration and Custom Machine Design

    “Logic will get you from A to B. Imagination will take you everywhere.”
    Albert E.

  3. #3
    Join Date
    May 2006
    Posts
    27
I understand the open / closed loop part.

I was more interested in:
Interpolation (speed / smoothness)
Look-ahead (speed / ability)
Constant velocity ??

My reason for asking is that my current setup uses Mach 3 and I don't seem to get the above abilities the way I would expect.

Example:
Interpolation... my axes move at a higher speed than I would need, but add in interpolation and speed goes south quick (slow).

Constant velocity... my axes seem to almost step through smaller movements with a split-second pause between them (G-code lines).

Is this the downfall of using a Mach3 / Gecko style system?

FYI, I am running 1200 oz-in steppers and a 68V power supply (10A).
And yes, I have asked ArtofCNC - no real fixes.

    Thanks
    Curtis

  4. #4
    Join Date
    Dec 2005
    Posts
    3319
Mach is a step/direction system and it doesn't close the loop between what the steppers ultimately do and what they're told to do.

Sort of like driving a car down the street via radio control - do you feel the acceleration in the seat of your pants if you pull the throttle trigger harder? The radio (computer) tells the motor to speed up and slow down but can never feel whether it did it or not. That's the easiest way to look at it. Even if you use a servo card with Mach, you don't get closed-loop feedback to the PC.

Yes, the PC sent out a directive via the LPT port, but until/unless the error gets too bad, it continues sending out pulses fat, dumb and happy until the servo card sees a sufficient error. ONLY then will the card notify the PC, and again that will only be recognized when the next LPT port call occurs. Although the encoder feeds back to the servo card, the card doesn't feed an error back to the PC until/unless it gets too far behind.

In my servo system, the moment the drive faults (overcurrent, no feedback signal, lag signal, whatever) IT STOPS. It doesn't do so many steps, get so far behind, and then say, "we have a problem, better stop because the problem finally got TOO bad".

Look-ahead functions and interpolation (which need computer time for the look-ahead and the calculations to happen) will bring throughput to a crawl. This is especially true depending on how your processor does background calculations and the priority it gives them.

Instead of interpolation, try doing point-to-point milling in very, VERY small steps. We do it on cam profiles and you can't tell that we've got teeny tiny flats on what should be a fully radiused surface. If you make 2880 flat cuts to create a circle, you can't tell after you buff it with 320, then 600, then Scotch-Brite.

    At 360 cuts, the thing looks like a diamond disco ball with all the facets. Sounds like a machine gun when a roller runs along the surface.
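The facet-count argument above can be checked with a little geometry: the deviation between each flat chord and the true circle is the sagitta, r(1 - cos(pi/n)). The 1-inch radius below is an assumption purely for illustration; the cut counts are the ones from the post.

```python
import math

def facet_height(radius, n_facets):
    """Sagitta: max deviation between a chord facet and the true circle."""
    return radius * (1 - math.cos(math.pi / n_facets))

r = 1.0  # assumed 1-inch radius, for illustration only
print(facet_height(r, 2880))  # ~6e-7 in: far below what buffing removes
print(facet_height(r, 360))   # ~4e-5 in: big enough to see and hear as facets
```

So at 2880 cuts the flats are well under a microinch deep, while at 360 cuts they are roughly 40 microinches, which lines up with the "disco ball" observation.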

We have the ability to do constant surface speed tool path cutting via S speed changes for every single cut point. The RAMDRIVE program in DOS does the look-ahead, the controller sends out the code, the servo amp responds, and if it does so properly, the cutter keeps moseying right along. We run real slow (inches per minute) and real high spindle speeds because we're looking for a mirror-like finish.

Why can it move so slow and smooth at CSS?

Because a 12-bit ANALOG voltage is being sent to the servo amp (in increments of about 0.005 VDC, +/-10 VDC full scale). Between the possibility of the stepper hitting a resonance frequency and/or the step function not being sufficient to provide adequate control at small incremental changes, it is possible for the system to hang up, default to a safe mode, or have the motor start to "shiver", and that isn't what you want/need.
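As a quick sanity check on the increment quoted above, the resolution of a 12-bit DAC spanning -10 V to +10 V is just the full-scale range divided by 2^12:

```python
# Resolution (one LSB) of a 12-bit DAC spanning -10 V .. +10 V.
bits = 12
full_scale_v = 20.0              # +/-10 V total span
lsb = full_scale_v / (2 ** bits)
print(round(lsb, 4))             # 0.0049 V per count (~5 mV)
```

That works out to about 4.9 mV per count, consistent with the few-millivolt increment described in the post.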

In your case, you're still dealing with a step function. You may have reached the limitation of your stepper motors and/or the limit for a particular step/direction system. Steppers do amazing things. But they can't be turned into servos no matter how hard you may wish or try.

  5. #5
    Join Date
    Apr 2006
    Posts
    402
    For starters: it's not a dumb question, but you must have the opportunity to look on both sides of the fence. Fanuc etc. are dedicated and embedded controllers. This means that you cannot do much more with it than controlling coordinates on a machine. A PC is an office machine which gives you, in the end, on screen, the sum of 1 + 1.

    The advantage of a PC, fast execution of a part of a program or a whole program, disappears if it has to execute the result of a small program. This means: sending pulses to the parallel port in timeslices is crippling. The design is made to send all the data in a stream to the port. A printed page!

    A real time processor derives the time from a timer. When the timer is zero or overflows, depending on the type of processor, a new task can be started. For example the generation of one step. When the task is done, the program waits for the next timer interrupt.

I cannot look inside the Mach3 software, but it looks like its pulse delays are generated by a loop: load a number, decrement by one, and exit if the number is zero.
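The two timing strategies being contrasted can be sketched like this, with Python standing in for firmware (both functions are conceptual stand-ins, not real-time code, and the loop count is arbitrary):

```python
import time

def software_delay(n):
    """The PC-style approach described above: burn CPU counting down.
    Any OS interruption stretches the delay unpredictably."""
    while n > 0:
        n -= 1

def timer_wait(period_s):
    """The embedded approach: wait for a hardware timer to fire,
    leaving the gap free for interpolation and look-ahead work."""
    time.sleep(period_s)  # stand-in for a real timer interrupt

# One "step" per tick in each scheme:
software_delay(100_000)  # duration depends on CPU speed and OS load
timer_wait(0.001)        # duration fixed by the timer, not the CPU
```

The difference is exactly the overhead problem in the next paragraph: the decrement count would have to be retuned for every change in CPU and background load, while a hardware timer doesn't care.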

Where the embedded controller can do initialising of new instructions, calculate interpolation, and maintain constant velocity in its free time, for the PC everything is overhead. The number to decrement should be, but cannot be, adjusted for the specific software load of the moment.

    So the easy solution is: buy the fastest PC you can get. The number in the delay loop grows, the overhead is executed faster. This is noise reduction by numbers.

    The costly solution is: buy a Fanuc and have a specialised controller.

    The crazy solution is: do it yourself and invent new G-codes and new approaches.

  6. #6
    Join Date
    Mar 2003
    Posts
    35538
    Quote Originally Posted by curtisturner
    Example:
    Interpolation.. My axis move at a higher speed than i would need, but add in intrpolation and speed goes south quick (slow )
    What is the current set on the Geckos? Perhaps your power supply is the culprit when multiple motors are moving together.

    Quote Originally Posted by curtisturner

    Constant Velocity.. My axis seem to almost step through smaller movements with a split second pause between them (g-code lines)

    Is this the downfall of using a Mach3 / Gecko style system
Are you sure you're in CV mode? Is there a G64 at the top of your screen? Or are the segments very, very small? There are literally thousands of Mach users, and this is not a common problem that I've heard of. (I read every post on the Mach Yahoo support group, about 100 a day.)

Also, how fast is the PC you're using?

    Here's a link to a Mill controlled by mach3, but using a Gecko G100 to send the step and direction. Very smooth and quick.
    http://machsupport.com/forum/index.php?topic=502.0
    Gerry

    UCCNC 2017 Screenset
    http://www.thecncwoodworker.com/2017.html

    Mach3 2010 Screenset
    http://www.thecncwoodworker.com/2010.html

    JointCAM - CNC Dovetails & Box Joints
    http://www.g-forcecnc.com/jointcam.html

    (Note: The opinions expressed in this post are my own and are not necessarily those of CNCzone and its management)

  7. #7
    Join Date
    Dec 2005
    Posts
    3319
fkaCarel: I now know why it is so hard to do the simple task of retrofitting a PC that does CNC control in place of a Fanuc controller.

    I'm going to add that to my cheat sheet. Good info that comes up over and over.

  8. #8
    Join Date
    May 2006
    Posts
    27
As far as the power supply, I tried 2 separate PSs and it's still a problem.
I am using Gecko 201's.
I also have 2 no-name-brand drivers that I originally had the problem with, and I bought the Geckos thinking they were the problem.

Yes, I have G64 up there and in my G-code (just in case)
-- first thing I checked. Art worked on it with me, going down the checklist of the easy things, and still nothing.

I have a 1.6 GHz computer with 768 MB of RAM.

I also did not think this was normal, but can't get a straight answer.
Did not want to go the route of a Fanuc unless absolutely required.

    Thanks
    Curtis

  9. #9
    Join Date
    Dec 2005
    Posts
    3319
    curtisturner: you just may have to - go Fanuc

    See post #5 above.

If you do, the Fanuc will have servos instead of steppers and true closed-loop feedback...

  10. #10
    Join Date
    May 2006
    Posts
    27
What is a decent price for a Fanuc system 10 or 11,
everything including the servos, APPROX?

    CT-

  11. #11
    Join Date
    Dec 2003
    Posts
    24223
Before you consider Fanuc, there are many equally up to the task. (Fanuc 10 & 11 are way outdated; I don't think you can buy them new.)
I have found Fanuc not very receptive to the one-off retrofitter etc.; they are interested in the OEMs, they are expensive, and support is lousy unless you are one of the above-mentioned OEMs.
Anyone looking at an equivalent, I would suggest Fagor or Mitsubishi.
If you are a one-off user, the things to consider are whether the company offers free instruction and/or development software; Fanuc charges for everything.
With these kinds of systems, you have to be able to write the machine logic, i.e. the M, S, T codes, as these are machine unique.
This is usually written in ladder logic or some kind of Boolean arithmetic (not really that hard).
Even if you buy a used system, the ladder etc. would have to be modified to suit your machine.
Personally, I have found Mitsubishi support outstanding (they are used by Mazak, BTW).
You are looking at a minimum of $10~$14k.
    Al.
    CNC, Mechatronics Integration and Custom Machine Design

    “Logic will get you from A to B. Imagination will take you everywhere.”
    Albert E.

  12. #12
    Join Date
    Mar 2003
    Posts
    2139
    I would try a different PC, even if you gotta grab it outa your office temporarily. Don't change the control yet. The combo you are using is known to work and work well.

    Eric
    I wish it wouldn't crash.

  13. #13
    Join Date
    May 2005
    Posts
    2502
    Quote Originally Posted by fkaCarel
    So the easy solution is: buy the fastest PC you can get. The number in the delay loop grows, the overhead is executed faster. This is noise reduction by numbers.

    The costly solution is: buy a Fanuc and have a specialised controller.

    The crazy solution is: do it yourself and invent new G-codes and new approaches.
    The explanation about timing loops was directionally correct, but pretty far from reality. I don't know that I would add that to the cheat sheet as is. Likewise NC, it isn't exactly true that the moment your drive lags it stops while the Gecko system is somehow inferior because it doesn't communicate until enough error occurs. Both systems have feedback loops by their nature. There is always some amount of following error there in a closed loop system as it struggles to keep sync'd with the encoders. Both systems have a threshold beyond which they fault, and that threshold has to be greater than 0 or the feedback process can't happen.

Gecko has chosen a value of 128 steps at which to fault on the controller. Is this a bad value? Too large? It kind of depends on what a "step" is. In this case, it is encoder counts, which by virtue of quadrature means 1000 per revolution for a typical "low res" encoder. So 1/1000th of a servo revolution, translated through whatever drive reduction (2:1 or 3:1 = 2000 to 3000 per revolution), translated through whatever your leadscrew offers.

    Put all of this together as 1000 resolution per rev * 3:1 reduction * 5 TPI and you are looking at 128 * 1/15000 of an inch = 8.5 thousandths. Now that is for a fault, and in a lot of systems you are done once you faulted, meaning you can't just restart from that point. At the least you probably need to rehome the system because it may have lost track of where it is. You therefore don't want to fault at the drop of a hat by setting that tolerance too low. It is pretty routine for things to get back on track as the servos catch up again. Some systems even let you set a time parameter for how long the following error can exceed the maximum before you get a fault.
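The arithmetic above is easy to reproduce; all of the numbers here come straight from the post:

```python
# Reproducing the post's fault-threshold arithmetic.
counts_per_motor_rev = 1000   # encoder counts per rev (quadrature)
drive_reduction = 3           # 3:1 belt reduction
tpi = 5                       # 5 turns-per-inch leadscrew

counts_per_inch = counts_per_motor_rev * drive_reduction * tpi  # 15000
fault_threshold_counts = 128  # Gecko's fixed following-error limit

fault_inches = fault_threshold_counts / counts_per_inch
print(round(fault_inches, 4))  # 0.0085 -> the "8.5 thousandths" above
```

Changing the reduction or leadscrew pitch scales the result directly, which is why the same 128-count threshold can mean very different linear tolerances on different machines.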

    I would be curious to know what that threshold is for the Fanucs and such. We might find it is not so different. EMC uses a default of 0.010", which is actually a greater error than is built onto the Gecko boards in this example. I Googled for a bit and found a Grundig CNC paper on MMSOnline that used the same value, so maybe that is conventional wisdom.

    The Heidenhain guys have a fascinating account of how "closed loop" isn't really closed loop unless you measure how far the axis really moved using linear scales as opposed to whether the servo "lost steps":

    http://www.heidenhain.com/wcmsmimefi...3_20_15180.pdf

    The figures they give about how much error temperature related effects cause along these lines are pretty interesting, so they have a point too in this endless religious debate. They are just simply a little more rarified sect.

    But we digress!

    It is the conclusion that counts:

    - Buy a faster PC. Yup. I see guys over on the Mach board wondering why a 333 MHz machine is struggling. Get a real PC, 1 GHz or so. They're cheap. Much cheaper than dedicated controls. Watching the video offered by Ger21 it would be hard to conclude that the description of the difference between a Windows based system and an embedded controller was as simple as what was presented (it isn't!).

    - Use Servos with Gecko drivers, they will fault just like the big guys if steps are lost. While the faulting parameter is not tunable, it is also not bad as a default. Running Mach with Geckos is pretty darned easy, there is tons of free support on these boards, and it has been done successfully by many folks. There are also commercial OEMs that use this technology as well as OEMs like Tormach that are happy enough with steppers.

    - Failing that, look at running encoders with steppers if you need to catch the lost steps. That also works just fine. Rogers Machine will sell you a card and the Mach 3 bits needed to make that work for you.

    - Look at the GRex. It is immature at present, but offers a host of advantages. In particular, it runs the nasty timing loop portion that we seem to feel is a problem for Windows on an embedded microprocessor. The performance being seen with it is apparently astounding. It comes with encoder inputs for all axes. Slightly longer term Mach will be enabled to work full closed loop with it.

- If you don't want a project, buy someone's turnkey conversion kit. However, I read about just about as many headaches with those, at much greater cost, as I do with Mach. So beware, you may have a project there as well.

    - If you really don't want a project, buy a brand new turnkey CNC machine from Haas or someone similar.

    BTW, in all of this minutiae, we have overlooked some of the most important differentiators of the industrial grade CNC stuff:

    - It is typically set up by an experienced OEM for a particular machine combination more often than a one off or conversion. They have gotten the bugs out (hopefully) before you get your hands on the machine.

    - It is carefully tuned for that combination of machine, motors, drivers, and control.

    - The software may be highly optimized for the machine.

    - The front panel and conversational modes may be a lot more advanced than one finds with Mach 3. It's quite a bit of effort to build a nice control panel for Mach 3. It's been done and is very doable, but you start to move out to a more rarified atmosphere.

- There are some pretty nifty very high end features available in controls like Fanuc that are not yet there for Mach 3. For example, there is an option to directly interpret NURBs instead of approximating them with line and arc segments. Other much talked about options are look-ahead and interpolation. These are mechanisms to smooth the roughness of those line and arc segments, and to look ahead at what is coming so that if the axis has to change direction too suddenly, it is planned for and the following error isn't too great. The machine may have to accelerate/decelerate before it finishes the current block of G-code in order to meet the needs of the next block in these scenarios. The Grundig article I mentioned earlier talks about some of this (http://www.mmsonline.com/articles/059604.html). If you are building $2000 titanium + carbon fiber toilet seats for the Pentagon's stealth bomber (can't have a radar signature on your toilet seats!), this stuff is important!
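The core of that look-ahead idea can be sketched with nothing more than the constant-deceleration kinematics v^2 = u^2 + 2ad: given a corner ahead that must be taken slowly, the planner works out the fastest speed the axis may have right now and still brake in time. This is a generic illustration, not Fanuc's or Mach's actual planner, and the numbers are hypothetical.

```python
import math

def entry_speed_limit(corner_speed, accel, distance):
    """Max speed allowed now such that the axis can still slow to
    corner_speed within `distance` at deceleration `accel`
    (from v^2 = u^2 + 2*a*d)."""
    return math.sqrt(corner_speed ** 2 + 2 * accel * distance)

# Hypothetical numbers: a sharp corner ahead requires 10 units/s,
# it is 50 units away, and 100 units/s^2 of deceleration is available.
v = entry_speed_limit(10.0, 100.0, 50.0)
print(round(v, 1))  # 100.5 -> if we're going faster, start braking now
```

A real look-ahead planner runs this kind of check backwards over many upcoming G-code blocks, which is exactly why it needs buffered blocks and spare computing time.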

    With all that said, Art is making huge strides with Mach. His new plug-in architecture is going to open it up to a lot more contributors, and progress should accelerate from there. He says he has 5400 users. His stuff runs on the world's most popular computer and OS architecture. History in the computer industry has shown that this stuff will generally win out over the proprietary approaches, and it won't take all that long to do so.

    Heard anything from DEC or Cray Research lately?

    Cheers,

    BW

  14. #14
    Join Date
    Dec 2005
    Posts
    3319
    The problem is that many folks, including myself, don't REALLY understand what is going on between any number of key strokes we make and the piece of paper that is printed because of what we typed.

    If one takes the attitude (which I did) that the computer only counts 1's and 0's, it should EASILY be able to repeatably cut a circle cuz it does so real fast and accurately.

Assuming that this assumption was correct, I spent weeks eliminating the mechanical gremlins that I thought were causing mine NOT to do it. Even then, the darn thing would put flats at the turn-around points and the 0,0 point would wander off by 0.005" or so at the end.

The guy who knew the electronics and servo tuning tricks came and showed me that there is "magic" in obtaining the proper "tune" for the servos. There is also a real challenge in learning what the idiosyncrasies are in your PC, which have to be addressed in order to get it to process data as WE want, HOW we want.

Recall that PCs were adapted TO use in CNC retrofits; they were made to DO other stuff (flight simulation, word processing, playing DVDs, you know, the REAL stuff PCs are used for). Big difference.

I'd contend that the guys at Mach would have a far easier time if their system were ported to a dedicated machine with known interrupt tendencies and/or whatever other gremlins they surely have to deal with that the folks in Redmond hid in XP and its derivatives. These are only made tougher by the hundreds of ways PC motherboard chipsets process the data, and in what order.

Imagine literally plugging and playing with a dedicated PC that doesn't do flakey stuff - or do it differently in this box versus that. The cost of the PC might double or triple, but think of the performance you could get by not having to worry about the microprocessor deciding it was time to refile the e-mail (which it was holding open for god knows what reason), or go into battery-conserve mode because the battery just dropped 1 millivolt below the conserve-battery threshold.

Aaaah, the what-could-be's. In the meantime, compromises are made and the REALITIES of CNC become public knowledge via life's experiences.

The neighbor's Haas cuts my cam profile masters in less than 15 minutes. My BPT EZTrak consumes nearly 3 hours running EXACTLY the same number of lines of code. The dedicated CNC does a better and more efficient job of crunching the numbers it is supposed to crunch.

The PC-based control in our EZTrak has to be run slowly so it doesn't herk/jerk around. Its comparatively older and compromised PC-based controller simply needs more time to crunch the exact same G-code numbers.

Via trial and error, we found that slowing down the process, plus some other tuning parameters, resulted in dazzling accuracy out of a machine that shouldn't perform that well - even my BPT-trained tech said so... But to do it, you have to time it with a calendar as opposed to an egg timer.

When we oversimplify the actions of the PC into only counting 1's and 0's real fast, it is easy to lose sight of what it really takes for the myriad of different electro/mechanical components to synergistically come together to mill a simple circle.

Them little electrons are flowing around pretty fast and, I contend, some of them fall off of the board traces and get lost. People claim it is "lost steps" but I know better.

Found a bunch of them had collected themselves and migrated to the coolness of the refrigerator. Seems they had found a way to keep the light on when we closed the door, and they were partying until the wee hours over the weekend. Does anyone know what the gestation period of electrons is? I suspect some of mine surely got "lucky".

I don't feel the software vendor is to blame that the software does funky stuff on some of the infinite blend of PCs and ancillary hardware that's out there. I'd challenge the users' group to poll the user base and find out which groups/brands of components gave the least amount of trouble, all things considered.

Those should then be the "baseline" system to recommend. Sadly, the way the industry is, it will be obsolete in 3 months and you'll have to start all over.

Anyway, if this were done, I'd bet a lot of wheel spinning and/or reinventing would not be needed in troubleshooting a bad synergy of the PC's "finest offerings" (ergo, right tool, wrong job).

Maybe it's time to develop DIY controls on/for a PC104-based standard system. At least they still offer simple stuff as old as industrial 486s, Pentiums and other "obsolete but adequate" processors.

A bit more pricey, but at least the stuff was designed for industrial use and doesn't have some of the gingerbread you probably don't need in a machine controller - like a Tomb Raider graphics card or a DVD player for your Jennifer Tilly (talk about HOT) sampled movie clips. Important things, granted, but not out on the shop floor.

  15. #15
    Join Date
    May 2005
    Posts
    2502
    NC, there's a TON of differences between your neighbor's Haas and your EZTrak. I have a hard time believing it all boils down to just swapping the Haas dedicated controller in place of your EZTrak's PC! I don't know that you were even suggesting that.

The PC does a fine job, and is much more maligned than it really should be. It is too often the excuse and crutch when, as you have pointed out, there are a tremendous number of other factors that have nothing to do with the PC. PCs have also gotten many orders of magnitude faster since the days of the EZTrak, and that makes a huge difference. I've used Unix, Linux, and Macs, and despite all the protestations that Windows crashes more, it's bunk. I make my living building large-scale Unix software today, and made it in the past writing Windows desktop software like the spreadsheet Quattro Pro.

Perhaps the greatest Achilles heel of the PC has been the use of the parallel port as a real-time control interface. Truly, that has been a kludge, and particularly difficult to make work well under Windows. Yet, as the machines have grown faster and Mach has evolved, it has reached a stage of working pretty well.

The next iteration in that evolution is at hand, and it totally changes the equation in a way that favors the PC tremendously. I'm talking about the GRex and similar devices that offload the innermost (and most finicky) processes to exactly the dedicated controller that you are lusting after. Yet they leave a PC firmly in control, so you keep all of the advantages of low cost, a graphical user interface, and the plethora of programs available there. This is really a cool innovation, NC, and one I think you would find fascinating if you ever had a desire to play with it.

The parallel port issue is eliminated--GRex and similar boxes communicate using either USB or LAN at much higher speeds and without the limitations. The timing issues are eliminated as well, because GRex receives very high-level instructions similar to G-codes, and it has a bunch of them. If the PC hiccups for a brief while, the GRex can carry on without it during that time. The PC gets to do what it does well and the GRex handles the rest.

This all happens at a very low incremental cost--a GRex is about $300. Some of the other USB boxes will be even cheaper. Both Art and Mariss have expressed great surprise at the improved performance on stepper-based systems this provides. I mention steppers not because it's a stepper-centric system, but because they showed cases where steppers were performing at speeds that used to require servos in the pre-GRex days.
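The buffering idea behind these offload devices can be sketched as a simple producer/consumer queue: the PC streams high-level moves into a buffer whenever it gets CPU time, and the motion device drains it at real-time rates, so a brief PC stall doesn't stall the machine. This is a conceptual sketch only; the move strings and tick model are made up.

```python
from collections import deque

buffer = deque()

def pc_sends_moves(moves):
    """Non-real-time side: the PC tops up the buffer when it can."""
    buffer.extend(moves)

def device_tick():
    """Real-time side: execute one buffered move per timer tick."""
    return buffer.popleft() if buffer else None  # None = buffer underrun

pc_sends_moves(["G1 X1", "G1 X2", "G1 X3"])
print(device_tick())  # G1 X1 -- still executes even if the PC hiccups now
```

As long as the PC refills the queue faster on average than the device drains it, the machine never sees the PC's timing jitter; only a sustained stall empties the buffer.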

    I paid for my college degree consulting for Cad/Cam companies of the day. At that time Intergraph and Computervision were the Kings. They ran on proprietary hardware, and they were proud of it, and felt nobody would ever produce a product as good without that proprietary recipe. I did most of my consulting for a whole host of little firms that were writing CAD/CAM programs to run on desktop workstations. Today it would be hard to argue that the desktop machines hadn't pretty much wiped out all the proprietary players. The same is unfolding at the CNC machine level, and it is fascinating to watch and be a part of. It happened in the same way I've talked about with GRex. They took the finicky parts off of the PC's plate and moved them out to a dedicated controller--what you have referred to as a Tomb Raider graphics card. BTW, they do have a place on the shop floor if you fancy toolpath simulation at all. Today, graphic controllers actually have more transistors and are in many ways more powerful than the PC's cpu itself, yet they are cheap enough to be used for games. They have powerful vector instruction sets just like the Cray computers of my college days that were being used to design atomic weapons. They're scary powerful!

    This trend has happened many times in the computer industry, and is inescapable for CNC too. Resistance is futile--when your EZTrak finally cannot be repaired further, you'll have to decide whether to embrace the newness, or join the gang on HSM talking about how great those old Southbend lathes are and why nobody really needs CNC anyway! LOL, just teasing! But I'll bet you would enjoy learning more about this GRex phenomenon and Mach if you'd give it a chance.

    Cheers,

    BW

  16. #16
    Join Date
    Dec 2005
    Posts
    3319
    BobWarfield: the point about the Haas vs Eztrak wasn't made clear enough by me.

    Point is this: the code that both machines use is IDENTICAL (point to point G code milling).

    We found that the Trak can hog the metal away but at high speeds, the feedback isn't quite as good as that which is in the Haas. The error was stacking up to be greater.

Dunno why, but the Trak (running on an ISA PLC card in a DOS 133 MHz Pentium box) can only process data so fast, and tends to deviate more (error) if you try to push it. So it's several masters in 1 hour on the Haas, or 1 in 2.5 to 3 with the Trak, with the EXACT SAME G-code. The throughput on the dedicated CNC is far superior to that of the 10-year-old PC-based technology.

If a PLC-based "professional" machine (Trak) with true direct servo feedback has issues running fast and keeping up, how can you expect a device that is running (brilliantly and amazingly well) through a parallel port to provide comparable performance?

I understand what you're saying, although it may not quite show. I'm also glad you're explaining it to all of us. It really puts this PC-based CNC thing into a better perspective. This post will be put into my "cheat sheet" for future reference, for when the well-intended but ill-conceived "we can hook any old computer up to it" comment gets made in the future.

    Simple case of right tool for right job and every tool has its limitations and wishing won't make it so.

  17. #17
    Join Date
    Dec 2005
    Posts
    3319
BW: I'm in the odd position of NEEDING CNC-based tools but being unable to afford the Haas version that I should be using. As such, I have to rely on "legacy" hard/software to keep my retirement project/business afloat until the 'end', or until the lotto deal rewards me - maybe I ought to start buying tickets.

I'd jump to a Mach in a heartbeat, BUT when I contacted them directly about being able to interface with servos, they said it couldn't be done. They said, "you'll have to write code," which immediately rules it out - I'm not in a position where I have the wherewithal to learn how to write code. Amazingly, I learned on this site that Mach can run with servos, but the decision process has moved on since.

    I have no doubts that the 'Trak's lame yesteryear technology has already been eclipsed with newer technology. Yet, this 10 year old system does stuff in such an easy to use, well founded, brainless fashion, it simply amazes me. Even the neighbor with the Haas and the BPT guy who came in to tune it sat there in astonished awe watching it do what (as one guy admitted) he'd only seen a Haas tool room mill do. And we were told it couldn't be done....

    Sorry to digress...

    Some of us are looking for MANAGEABLE "heathkit" (actually component stereo) like projects to do CNC retrofitting on/for/with.

    We're not code writers nor are we all ME's or EE's or even college educated at all although some of us purport to be. Some of us simply long for the day when Grex will be ready to assemble into a psuedo plug together control system. Just don't get super side tracked - don't let mission creep set in. Hard point the features list, leave room for growth/development and get something that works.

    IT would be neat to fire up a system where a nice screen is IN THERE already and not something you have to write VB or whatever with to make it go. No time to do, license one that's already written by a Mach user. If it is SO easy to do, provide it with the system as a default boot screen for lathe or mill and you immediately have a more user friendly device - don't like what you see, go (enter manual page here for DIY modification instructions).

    As much as I hate Windows, it did something good for society. It gave almost anyone the ability who could turn on a PC simply boot it up and start doing productive stuff. When you quit having to self program a computer and bought functioning software, the sales exploded. I dare say, that a lot of small shops out there are in exactly the same position as I, more or less. Need it, wan't it, don't have the time or skill to do it ALL ourselves....

    If you can turn on the stereo and listen to and enjoy fine music WITHOUT having to LEARN music, why should you have to learn all sorts of fancy programming to make a so-called mill or lathe retrofit "kit" work? Kit, hardly. DUH.

    A DIY kit doesn't have to be an entirely DIY KIT - I don't think some of us are looking to resurrect Heathkit, but something more like the "component days" of stereo creation might do wonders. Gee, do you think Centroid maybe got started doing things just that way?

    As bad as Gates and Co. are, they knew how to make a product good enough to get the world - even folks who only know how to turn a PC on - hooked into relying upon it to be productive.

    I hope this makes sense. Oh, so much more can be done by those of us who NEED what the Mach can do but WANT NOT to have to do all the extra work needed to make it go in our application.

    Maybe Grex is the salvation. Maybe something else is. Time will tell. Got to go get some of my lame a$$ yesteryear technology interfaced and operational so I can pay more taxes next year and cover luxuries like gas, food and rent.

  18. #18
    Join Date
    Apr 2006
    Posts
    402
    The explanation about timing loops was directionally correct, but pretty far from reality
    Perhaps their greatest Achilles heel of the PC has been the use of parallel ports as a suitable real time control interface. Truly that has been a kludge, and particularly difficult to make work well under Windows. Yet, as the machines have grown faster and Mach has evolved, it has reached a stage of working pretty well.
    I stated that I was not able to look inside the software, so it was my estimated analysis. So what is the reality?
    - Look at the GRex. It is immature at present, but offers a host of advantages. In particular, it runs the nasty timing loop portion that we seem to feel is a problem for Windows on an embedded microprocessor.
    I think the "embedded" has crept in, but it confirms my opinion. The function of the GRex is just to time and buffer - let's call it a buttonless, invisible and limited embedded controller. It redefines commands. Its sales pitch is to make interrupted data streams continuous. Reading NC's comment, an effort to make a high-resolution continuous circular-interpolation data stream would be welcomed. These kinds of internal resolutions also need to be part of the discussion, since they obviously need to be refined.

    Watching the video offered by Ger21
    A video runs at 30-50 frames per second. How you can use a video to disprove time-domain differences on the order of 1/25000 sec or less is a mystery to me. This is slow-oscilloscope territory.
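    As a quick sanity check on the scale argument above (a sketch only, using a typical 30 fps frame rate and the 1/25000 sec pulse timing quoted in the post):

    ```python
    # Rough time-domain comparison: one video frame vs the pulse
    # timing it is being asked to judge.
    frame_rate = 30            # frames per second (typical video)
    pulse_period = 1 / 25_000  # seconds, the post's "under 1/25000 sec"

    frame_period = 1 / frame_rate          # ~33 ms per frame
    pulses_per_frame = frame_period / pulse_period
    print(round(pulses_per_frame))  # ~833 pulse periods elapse inside a single frame
    ```

    So hundreds of pulse-timing events are averaged away inside every single frame, which is the poster's point.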

    And then in general: servos call for a dedicated servo controller. The higher the encoder count, the better the PID function works. So if the maximum output of a PC is 50000 pulses/second, then the max rev/min of a servo with a 1000-count encoder in quad mode is: 50000 / 4000 * 60 = 750 rev/min. This speed limit already exists. I wouldn't trade speed against holding torque or resolution.
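    The arithmetic above can be checked with a few lines (a sketch only, using the 50,000 pulses/sec ceiling and 1000-line encoder figures from the post):

    ```python
    # Max servo speed when step pulses come from a PC parallel port.
    # Figures from the post above: 50,000 pulses/sec output ceiling,
    # a 1000-line encoder read in quadrature (4000 counts/rev).
    MAX_PULSE_RATE = 50_000             # pulses per second the PC can emit
    ENCODER_LINES = 1_000               # encoder lines per revolution
    COUNTS_PER_REV = ENCODER_LINES * 4  # quadrature decoding: 4x the line count

    max_rpm = MAX_PULSE_RATE / COUNTS_PER_REV * 60
    print(max_rpm)  # 750.0 rev/min, matching the post's 50000 / 4000 * 60
    ```

    The trade-off the poster describes falls straight out of this: raising the encoder count (better hold and resolution) divides the top speed by the same factor.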

    A tighter integration with a dedicated processor eliminates all these data pipelines. And do not mistake me, I am not against the PC as data generator and visualiser. It's just the eternal question: what is the best solution for a given problem? From my point of view there is a little too much band-aid in the PC approach.

  19. #19
    Join Date
    Dec 2005
    Posts
    3319
    Here's the problem.

    Joe DIY'er buys a Mach whatever. He's gonna CNC his whatchamacallit. He can't/won't do the math. Suddenly something doesn't work, and the 3000 rpm motor spins the encoder at a data stream rate that is beyond the counting ability of the stuff he's trying to interface. Someone will do the math and point it out, but the customer will feel ripped off nonetheless.
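    As a hypothetical worked example of the overrun described above (the post names no encoder, so the 2048-line count here is an assumption for illustration):

    ```python
    # Hypothetical worked example of the count-rate overrun above.
    # Assumption: a 2048-line encoder (not specified in the post),
    # read in quadrature, on a motor spinning at 3000 rpm.
    ENCODER_LINES = 2_048
    COUNTS_PER_REV = ENCODER_LINES * 4   # 8192 counts/rev in quad mode
    MOTOR_RPM = 3_000

    count_rate = COUNTS_PER_REV * MOTOR_RPM / 60   # counts per second
    print(count_rate)  # 409600.0 counts/sec - far beyond a ~50 kHz interface
    ```

    Against the ~50,000 pulses/sec ceiling discussed earlier in the thread, that is roughly an 8x overrun - exactly the kind of mismatch "doing the math" catches up front.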

    My fear is that our Australian friend who's buying stuff off eBay to retrofit his lathe in another thread is setting himself up for just that sort of rude awakening. Plug-and-play assumptions blind even keen eyes to what should be an obvious "it won't work".

    But if it were obvious, how come the customer didn't see it? DUH. So much for overestimating the intelligence and/or education level of the end user. Yes, it is a cerebral hobby, but stuff gets easily overlooked or OOPS'd.

    Yes, CNC is NOT plug and play - but the guy who makes it so will become inordinately wealthy. TRW did that in the automotive aftermarket and helped make the market viable by making their products fairly idiot-proof. Even though bigger idiots ultimately evolved, they were dealt with, and the overall level of the product rose in the process.

    DIY CNC vendors, take heed. Plug and pray can become plug and play with a bit more development effort and a higher-performing product.

    EDIT

    At one point 133 MHz was fast; now 1.5 GHz PCs are passe.

    1500 rpm lathe spindles were common, now 4K spindles are "So what?"

    386 gave way to 486 to Pentium to Celeron to ?????

    7000 rpm auto engines yesterday are 9k in Nascar versus 10k in NHRA Pro Stock today and 19k in F1. Tomorrow ?????

    END EDIT

  20. #20
    Join Date
    Apr 2006
    Posts
    402
    At one point 133 MHz was fast; now 1.5 GHz PCs are passe.

    1500 rpm lathe spindles were common, now 4K spindles are "So what?"

    386 gave way to 486 to Pentium to Celeron to ?????

    7000 rpm auto engines yesterday are 9k in Nascar versus 10k in NHRA Pro Stock today and 19k in F1. Tomorrow ?????
    No. I worked on a 1978 Gildemeister lathe with a Fanuc control. It was a highly functional, speedy and powerful lathe. It produced parts that passed Quality Control. There was no way to visualise programs; you had to do it in your head.

    It had a processor designed in those days. Looking at that machine, comparing it with the PC-related problems arising now, in 2006, and centering my attention on the seemingly biggest problem - threading - I only wonder where things went wrong. I have drawn my own conclusions.

    Everybody who designs and builds his own machine is an integrator. You should know the problems you face or have the ability to master them (the learning curve).
    So I have no fear for the results, nor pity for the people who start these projects. This week I spoke with somebody who also invoked the factor "luck" in his professional projects. I told him to wake up and not to include me in that.

    And on a side note: I call "Plug and Pray" "Advanced Reset Technology".
