Wednesday, December 18, 2013

Evolution of Engineering Simulation, 3D CAD and New Opportunities

My first 3 years in engineering school were dependent on the slide rule.  It still amazes me that we were able to do logarithms and trig functions to sufficient precision for almost all of our calculations.  One exception was in surveying, where we would have to use tables of logarithms carried out to 10 or more places.

In my senior year handheld calculators became available and engineering changed overnight.  I paid an extra ~$50 for my first one just to get the square root function.

45 years of Engineering Simulation.
My first engineering job was working for an aerospace company in Southern California in the late '70s.  I was one of 60 or so stress engineers working on some pretty high tech energy and propulsion systems.  Some of our creations put many an astronaut safely into space and back.  Our "simulation" tools, or as we called them at the time "analysis" tools, were mostly "classical" or "hand" calculations, and Roark's Handbook was our "Bible."  We did have access to some FE tools.  We needed to fill out the 80-column Hollerith card sheets and sweet talk the basement-residing keypunchers to punch the cards for us.  Our jobs would run overnight on the IBM 370 and be ready for us the next morning, with 1-2 inch thick printouts to sift through.  Many of my colleagues would build caves around their desks with their collections of these printouts.  I suppose that was a prelude to the present day cubicle.

Our thermal analysis brethren worked similarly as did the flow and vibration folks.

All the design work was done on drafting boards.  The high tech of the design room was the electric eraser.

Jump ahead to the mid 1980's and CAD made the scene. The CAD terminals and software were so expensive and difficult to use that the dimly lit CAD room was developed, with only the CAD specialists allowed to use the new stuff.  The design boards were still everywhere.  FEA had progressed a bit and shared dumb terminals were available for data entry.  The 80-column input format was still prevalent.  Post-processing was still sorting through reams of paper or viewing columns of numbers on a CRT screen.

Jump ahead to the late 1980's and the Mac and the mouse had arrived.  They didn't do much for engineering yet.

Into the mid 1990's and PC's were now pretty common in the more progressive companies.  3D CAD was starting to make some headway but it was mostly limited to dedicated designers in the companies that made the move.  CFD was becoming available.

In the late 1990's 3D CAD was very common.  Many engineers were becoming proficient at using it and doing designs from scratch and then passing them off to the detailers who many times would redraw the models to create the 2D drawings.  FE and CFD now had GUI's.

By 2000 3D CAD had matured to a near commodity.  The choice was usually based on seat cost and who you thought would stay in business. Some of the early market leaders were bought up or went out of business because they could not keep up.  The Windows PC was becoming the platform of choice for desktop and engineering applications.  FE and CFD tools could start to use CAD geometry data directly.

From 2000 to now the analysis tools have evolved to include "multi-physics," with structural, thermal, flow, electro-magnetic and other solvers becoming more integrated along with their respective interactions.  Using CAD geometry is standard procedure.  Many CAD systems have their own simulation tools built in, or at least connections to them. The job of designer (drafter) is less defined; in many companies it is being phased out or evolving into more of a design role rather than just drafting.

Where is this all headed?  I suspect in the short term there will be even more integration and further ease of use.  Many of the general purpose simulation tools will introduce more and more physics and multi-physics capability.  Users won't have to be as specialized to get results.

With these advances opportunities will also arise.  

The quality of the results will have to be checked more so than in generations past.  In the slide rule era one had to have a feel for the answer just to know where the decimal point should go.  One had to know which textbook equation was the most appropriate.  Today's tools don't require that level of intuition, and it has become very easy to generate gigabytes of results in milliseconds.

Many of the general purpose simulation tools try to be everything to everybody.  That's OK, but with that comes the burden of layers and layers of windows and tabs for the input.  For example, try to do a "simple" static beam simulation in one of the general purpose FE packages.  It will typically want a STEP file from a 3D CAD model for the geometry.  If that is not available the user could create a 3D model in one of the add-on GUIs or try to use a "stick" model (a 2D representation instead of 3D).  But then there are a dozen or so windows to define the section properties, boundary conditions, loads, material properties, desired output, and so on.  Want to do a transient response simulation on the same beam?  Add another dozen windows to meander through for selecting the modal or direct solution, time steps, load table, etc.
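Contrast that with the underlying classical calculation, which fits on the back of an envelope.  Here is a rough sketch (the material, section and load are made-up numbers, not from any particular project or package) of the same simply supported beam with a center point load:

    # Minimal sketch: simply supported beam with a center point load.
    # All values are illustrative assumptions, not from any real design.
    E = 200e9            # Young's modulus, Pa (steel, assumed)
    b, h = 0.05, 0.10    # rectangular section width and height, m (assumed)
    span = 2.0           # beam span, m (assumed)
    P = 10e3             # center point load, N (assumed)

    I = b * h**3 / 12                        # second moment of area, m^4
    delta_max = P * span**3 / (48 * E * I)   # midspan deflection, m
    M_max = P * span / 4                     # midspan bending moment, N*m
    sigma_max = M_max * (h / 2) / I          # peak bending stress, Pa

    print(f"deflection = {delta_max * 1000:.2f} mm")
    print(f"bending stress = {sigma_max / 1e6:.1f} MPa")

Three formulas and a handful of inputs; that is about the level of effort the "simple" problem itself warrants.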

If you are a "causal" user of the software you might have an answer in a few hours or even a day or two after starting. That is further dependent on whether you can even get access to the software because many of the general purpose packages are 10's of thousands of dollars to purchase and maintain and therefore most companies keep the number of seats in check.  A more experienced user might plow through this in an hour or two.  But in either case if a trade study is needed on the subject beam then many more hours might be necessary.  Macros could be written but that requires it's own expertise and usually only worthwhile if this will be a common scenario in the future.

Many of the general purpose tools are just not appropriate for some simulations.  Try modeling and optimizing a natural convection finned heat sink with several different heat source and fin scenarios.  Yes, there are CFD packages that can technically do this.  But it would typically take days to get the desired results and would usually have to be done by a CFD software specialist.

For preliminary or conceptual design studies the general purpose simulation tools and 3D CAD are not well suited and in most cases are overkill.  This part of some development projects requires quick scenario playing, with dozens of what-ifs quickly assessed.  The ideas that do not meet the general criteria are quickly tossed.  Trends are explored and general limits established.  For example, in a thermal management application one needs to quickly decide if air cooling is sufficient and, if so, whether it has to be forced or whether natural convection might work.
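As a rough illustration of that kind of quick check (every number below is an assumption for illustration, not from any real project), comparing the heat transfer coefficient the application demands against typical ranges for air is often enough to make the natural-versus-forced call:

    # Quick feasibility check: is air cooling plausible, and must it be forced?
    # All numbers are illustrative assumptions.
    Q = 40.0     # heat to be dissipated, W (assumed)
    A = 0.02     # available convective surface area, m^2 (assumed)
    dT = 40.0    # allowable surface-to-ambient temperature rise, K (assumed)

    h_required = Q / (A * dT)    # required heat transfer coefficient, W/m^2-K

    # Typical order-of-magnitude ranges for air:
    # natural convection ~ 5-25 W/m^2-K, forced convection ~ 25-250 W/m^2-K
    if h_required <= 25:
        print(f"h_required = {h_required:.0f} W/m^2-K: natural convection may be enough")
    elif h_required <= 250:
        print(f"h_required = {h_required:.0f} W/m^2-K: plan on forced air")
    else:
        print(f"h_required = {h_required:.0f} W/m^2-K: air cooling alone looks marginal")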

As a result, specialized software modules are now starting to be introduced that make these types of studies much more efficient before the more general purpose tools are needed, or even desired.

Sunday, April 26, 2009

Bicycle evolution indicative of overall technology evolution

The ever increasing pace of technology evolution is simply amazing. Examples of this are everywhere, from communications (the Pony Express to the PDA in 150 years), to engineering computation (the slide rule to desktop workstations in 30 years), to audio (vinyl records to the iPod in 80 years). Step changes at the start of the 20th century would typically take 2 to 3 human generations. Now 2 to 3 step changes are occurring within 1 human generation.

An example I'm very familiar with is the technology evolution of "upright" bicycles. The same basic derailleur bicycle had not changed from the early 1900s to the late 1970s. That vintage bike had brazed thin-walled steel tubing. Aluminum components. Two chainrings. Five rear cogs. Downtube-mounted friction shifters. A leather saddle on an aluminum seat post. Cleated cycling shoes with leather toe straps. Wheels were thin-walled aluminum with typically 32 to 36 stainless steel spokes of 14 to 16 gauge, straight or double butted. Tires were either clinchers with tubes or the lighter weight glue-on tubulars. Jerseys and shorts were wool with a soft leather seat chamois. Helmets, if they were worn, were leather strap nets. Racing weights were 20-21 lbs. The speedometer was a wristwatch and road markers.

Today's model is a monocoque carbon fiber frame with a carbon fiber fork. Handlebars, seatposts, cranks, derailleurs, saddles, cages and wheel rims all have carbon fiber options. Triple chainrings are common. Ten rear cogs are common. Handlebar-mounted index shifting with integrated brake levers is standard. Step-in cleats are used. Wheels range from monocoque tri-spokes to 12-15 bladed spokes. Even full disc wheels are relatively common. All components can be had in aerodynamic shapes. Spandex skinsuits with synthetic chamois are common. Handlebar cycling computers with GPS, heart rate monitors and training programs with downloadable databases are available. Aerodynamic, plastic-shelled helmets with built-in headphones are used. Electric hand and foot heaters are available.

The racing bike has evolved so quickly that the governing racing associations have had to implement weight and dimensional standards to slow the progress. For example, the UCI (Union Cycliste Internationale) weight standard has been set at 6.8 kg (about 15 lbs) even though technology can provide weights a few pounds less than this.

In the early 1970's a time under an hour would win most 25 mile time trials, including the US nationals. Today it takes closer to 48 minutes. Some of this improvement is due to better conditioning and more participation, but the bulk of it has come from the equipment. That is roughly a 20% improvement; compare it to the 10,000 meter track world record, which improved only about 5% over the same span, from 27:39 in 1965 to the current 26:17.

If the "upright" bicycle is compared to the broader field of human powered vehicles the analogy becomes even more amazing as shown by the hour records in the graphic.



Friday, March 27, 2009

CAE Tool Effectiveness and Opportunities

CAE design tool development has made continual and sometimes amazing progress in the last 3 to 4 decades. 3D, interconnectivity and multiphysics are readily available. However, as an end user, a manager of users and even a developer of such tools, I see several key challenges, or perhaps opportunities, in the field:

- Expense: in many cases the expense of the tool(s) keeps a large engineering segment from using them. Some of the systems can be priced upwards of $50k, with correspondingly high hardware and maintenance costs. Options certainly exist for access to the tools, such as consultants and in some cases pay-as-you-go use, but these options are often cumbersome.

- Hardware capability lag: Even with 64-bit processing, dual-core CPUs and huge RAM, some CAE models take too much time to develop, execute and post-process to really be effective in the design process. At best they may offer a final analytical validation, but they are ineffective for routine iterations when optimizing a complex design. I see some movement towards internet hosted systems and/or clustered systems that, coupled with continual computer and memory improvements, may help crack this problem. Other options include supercomputer time sharing, but that can be expensive.

- Casual users: Even though many software developers tout the user friendliness of their wares, in most cases a casual user is always relearning how to use the tool. This reduces their effectiveness. The larger and well funded organizations can perhaps afford the "full time expert," but that is not as common as in years past.

- Software sophistication exceeding many users' experience: In many instances the tool has too much capability for the users. Doing a 3D CFD simulation without understanding the assumptions behind the algorithms can lead to garbage in, garbage out, without anyone necessarily being the wiser until perhaps late in the process.

- Staying inside the prior generation's process box: The example I repeatedly cite is the teams that spend hours and hours developing 3D CAD models and then revert back to creating 2D drawings for communicating with other team members such as suppliers. I see no technical reason why this is necessary. The 3D model, with perhaps referenced specs, should contain all the necessary information.

- Forgetting Occam's Razor, probably better known as the KISS principle: Many models end up being way too complex for the need. The software has so many features and capabilities that the user is too easily pulled into trying to model virtually everything and many times ends up with a very cumbersome model. This is more of a management issue. It needs to be continually reinforced that modeling should be taken up in progressively more complex levels and never beyond the need of the design. If it doesn't work for the simple hand calculation it likely won't work for the 100,000 node FE model that might take days or even weeks to develop.

- Accuracy understanding: Many users lose sight of the bigger picture with respect to precision. An FE stress analysis will never be any more accurate than the certainty of the loads, boundary conditions and/or material properties (a short sketch of this follows the list).

- Virtually all the analytical CAE tools available today perform analyses on an existing concept. This is certainly a value added capability, but the next leap forward will be evolving them into direct design tools. In other words, the tool will provide direct design content, with assistance from the analytical tools, semi- or fully automatically. This will likely bring expert systems and intelligent design to bear in the process. Today's process is to lay out a concept, generally in a CAD system. The design is then ported to the various analytical simulations for stress, dynamics, CFD, thermal and so on. Each simulation provides feedback to the team for decisions on needed changes. The next generation tools will take the set of design needs and criteria and provide the design, or at minimum various options.

- ...................
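To make the accuracy point concrete, here is a minimal sketch with made-up numbers. Because the stress in this simple cantilever case scales linearly with the load, a +/-20% load uncertainty shows up as a +/-20% stress uncertainty no matter how many digits the solver prints:

    import random

    # Minimal sketch: load uncertainty passes straight through to stress.
    # Cantilever with a tip load on a rectangular section; all values assumed.
    b, h, span = 0.05, 0.10, 1.0    # section width, height and length, m
    I = b * h**3 / 12               # second moment of area, m^4
    c = h / 2                       # distance to the outer fiber, m

    P_nominal = 10e3                # nominal tip load, N, assumed +/-20% uncertain

    def stress(P):
        return P * span * c / I     # root bending stress, Pa

    samples = [stress(P_nominal * random.uniform(0.8, 1.2)) for _ in range(10000)]
    print(f"nominal stress: {stress(P_nominal) / 1e6:.1f} MPa")
    print(f"sampled range : {min(samples) / 1e6:.1f} to {max(samples) / 1e6:.1f} MPa")
    # The spread is still about +/-20%, no matter how fine the mesh
    # or how many digits the solver reports.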

Thursday, March 26, 2009

Vibration/shock isolators provide a double benefit

Vibration and shock isolators have been used forever in countless applications. A recent application I have been associated with highlights a double benefit that can be gained by their use. A diesel engine propulsion system on a railroad locomotive has isolators that are used to mount the engine to the locomotive frame. The engine is very dynamic and has relatively high vibration levels from the normal sources like driveshaft unbalance, piston firing and so on. The isolators reduce the amount of this vibration that gets transmitted to the frame. The benefits are lower noise, reduced dynamic loads on the frame and adjacent equipment, increased crew comfort, etc.
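As a rough sketch of why this works, here is the classic single degree-of-freedom isolator transmissibility (generic textbook theory with an assumed damping ratio, not the actual mount design). The transmitted vibration drops off quickly once the excitation frequency is well above the isolator's natural frequency:

    import math

    # Minimal sketch: transmissibility of a single degree-of-freedom isolator.
    # r = excitation frequency / isolator natural frequency, zeta = damping ratio.
    def transmissibility(r, zeta=0.1):
        num = 1 + (2 * zeta * r) ** 2
        den = (1 - r * r) ** 2 + (2 * zeta * r) ** 2
        return math.sqrt(num / den)

    for r in (0.5, 1.0, 2.0, 3.0, 5.0):
        print(f"r = {r:3.1f}: transmissibility = {transmissibility(r):.2f}")
    # Above r of about 1.4 the transmitted force is less than the applied force,
    # which is the regime a soft engine mount is designed to sit in.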

The second benefit is for the engine itself. Locomotives experience high shock loads from coupling into rail cars and other locomotives when building up consists, and from pulling and braking. The coupler shock loads are somewhat attenuated by the coupler draft gear, which is typically a laminated rubber bumper; however, some shock load still reaches the frame and makes its way to the engine mounts. For this case the shock load is isolated from the engine by the isolators.

A generally inexpensive device serving two important functions: reducing engine loads passing into the frame and reducing the frame transmitted shock loads passing back to the engine.

Saturday, October 27, 2007

Occam's Razor

William of Ockham (c. 1285–1349), a 14th-century English Franciscan friar, is credited with the principle that one should not make more assumptions than the minimum needed. This is more commonly known as Occam’s Razor or the principle of parsimony, which originates from the Latin “lex parsimoniae” and the phrase “entia non sunt multiplicanda praeter necessitatem,” which translates to “entities should not be multiplied beyond necessity.” The principle is perhaps better known as KISS, or "keep it simple, stupid."

Some examples of this principle in engineering:

As the part count increases for a design, the reliability generally decreases. For example, a bracket that is bolted to another structure will have lower reliability if 4 smaller bolts are used rather than 2 larger bolts, assuming each bolt has the same safety factor and hence the same probability of failure.
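A quick illustrative calculation, assuming any single bolt failure fails the joint and using a made-up per-bolt failure probability:

    # Minimal sketch: series-system reliability versus part count.
    # Assumes any single bolt failure fails the joint; p is a made-up value.
    p = 0.001    # assumed probability that a given bolt fails

    for n_bolts in (2, 4):
        reliability = (1 - p) ** n_bolts
        print(f"{n_bolts} bolts: joint reliability = {reliability:.6f}")
    # 2 bolts -> 0.998001, 4 bolts -> 0.996006: doubling the bolt count
    # roughly doubles the chance of a failure somewhere in the joint.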

A design load with a 20% uncertainty used in a sophisticated computer simulation with high precision results will still have no better certainty than 20%. Select the simulation method based on the level of certainty of the model parameters.

Sensors have failure modes. Will a sensor used to monitor a design function improve the overall reliability once the sensor's own reliability is considered?

Regression models should always be checked against first principles and limiting conditions. If they fail this review then the model and/or the data is likely wrong. The simplest regression model should always be used, and in many cases it will be linear.

Test data should always be used in the context of the test and measurement methods. It is very difficult to simulate real world usage in the laboratory.

A simple test is better than no test as long as the context of the test and quality of the data are understood.

The first calculation ever made for any concept or idea should be by pencil on a single sheet of paper. Anything more is a waste of time. If the concept doesn't pass the first principles test it doesn't have a chance of passing a sophisticated computer simulation.