Adam, Brian, Carmen, and Jeff consider the boundaries of one’s own competence in this episode of The Engineering Commons.
- Competencies seem to be driven by past failures, notes Brian.
- For learning about “magical potions” developed with “unicorn science” (chemical engineering), Brian recommends the NurdRage channel on YouTube.
- Brian takes a cruel pleasure in seeing Mehdi Sadaghdar regularly shock himself on the YouTube channel titled ElectroBOOM.
- The “circle of competence” is a notion introduced by Warren Buffett, to encourage investors to stick to businesses they understand exceedingly well.
- Brian attempts to recall a quote from Sun Tzu’s book, The Art of War: “If you know the enemy and know yourself, you need not fear the result of a hundred battles. If you know yourself but not the enemy, for every victory gained you will also suffer a defeat. If you know neither the enemy nor yourself, you will succumb in every battle.”
- Carmen proposes that the “circle” of competence might be better imagined as an “Airy disk.”
- A Merlin Mann presentation from 2014 introduces the Dreyfus model of skill acquisition. (Accompanying notes are available online for those without time to watch the video.)
- Carmen and Brian just returned from a week-long short course at the Center for Power Electronic Systems (CPES) Laboratory, located on the campus of Virginia Tech.
- Jeff mentions a book by Walter G. Vincenti, What Engineers Know and How They Know It: Analytical Studies from Aeronautical History.
- A cat’s-whisker detector is one of the oldest semiconductor devices. It places a thin wire in contact with a semiconducting material, thereby creating an elementary diode.
- The American Association of Engineering Societies (AAES) has created an Engineering Competency Model.
- The gang discusses a web page located by Jeff, titled Ultimate Cognitive Bias Survival Guide.
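The cat’s-whisker detector mentioned above acts as a rudimentary rectifying junction, which can be described by the ideal-diode (Shockley) equation. A minimal sketch in Python, with the saturation current, ideality factor, and thermal voltage set to illustrative values (not measurements of any real cat’s-whisker device):

```python
import math

def diode_current(v, i_s=1e-9, n=1.5, v_t=0.02585):
    """Shockley diode equation: I = I_s * (exp(V / (n * V_t)) - 1).

    i_s: saturation current (A), n: ideality factor, v_t: thermal
    voltage near room temperature (V) -- illustrative values only.
    """
    return i_s * (math.exp(v / (n * v_t)) - 1.0)

# Forward bias conducts; reverse bias barely leaks -- rectification.
forward = diode_current(0.4)    # appreciable forward current
reverse = diode_current(-0.4)   # roughly -i_s, nearly zero
print(forward, reverse)
```

The asymmetry between the two printed values is what lets such a junction demodulate a radio signal.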
Thanks to Ghost of Kuji for providing the photo titled “event horizon.” Opening music by John Trimble and concluding theme by Paul Stevenson.
In this episode, Chris and Jeff discuss the role of failure in advancing engineering knowledge.
- All things fail at some point. Engineers advance their own knowledge, and that of the profession, by analyzing these failures.
- As a guideline for our discussion, we reference the book, “To Engineer is Human: The Role of Failure in Successful Design,” authored by Duke University professor Henry Petroski.
- Several well-known engineering failures are recalled.
- Electronic failures, such as the Xbox “red ring of death,” don’t usually endanger human life.
- Chris makes the case that improved tools (CAD, FEA, etc.) and methodologies (Six Sigma) have reduced the number and frequency of engineering failures.
- Jeff counters that good tools don’t necessarily produce good results.
- Even with powerful tools for analysis, engineers can be surprised by black swan events.
- A 2009 report card from the American Society of Civil Engineers (ASCE) gives the nation’s infrastructure a grade of “D.”
- Failure often teaches lessons that cannot be learned from textbooks.
- A single problem is enough to mark a design as an engineering failure, while absolute engineering success requires a complete absence of problems.
- Chris has been working on mean time between failure (MTBF) calculations.
- Because of the many possible avenues for engineering failure, it is important that engineers be open to outside review of their work, and to reviewing the work of others.
- It is important that engineers remain humble when designing a complex system.
- Chris and Jeff discuss whether engineers should consider themselves the most likely source of design errors.
- Innovative design requires stepping outside the security of known procedures and methods.
- Myriad options are available when designing a system, but the number of unexpected interactions grows with system complexity.
- We learn about the nature of a design problem by iteratively moving from design concept to analysis, then back to concept as we discover what works and what doesn’t work.
- Keeping track of “bugs” is an important part of improving a design.
- Safety factors for aerospace design may be in the range of 1.2 to 1.4, while elevator cables are designed with a safety factor of 11.
- Testing is an important part of reducing uncertainty.
- Accelerating failure can be a competitive advantage.
- Construction of the Crystal Palace is given as an example of engineering success, housing the Great Exhibition of 1851 in London, England.
- Joseph Paxton, designer of the Crystal Palace, was inspired by the shape and structure of giant lily pads.
- Addressing the concerns of critics, Mr. Paxton conducted public testing of his structures.
- Metal fatigue problems caused several crashes of the de Havilland Comet aircraft. This was a case where state-of-the-art analysis proved insufficient.
- Learning from failure is an important part of the engineering profession.
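The MTBF calculations Chris mentions can be illustrated with a small sketch. The fleet data below and the constant-failure-rate (exponential) reliability model are illustrative assumptions, not figures from the episode:

```python
import math

def mtbf(total_operating_hours, failure_count):
    """MTBF = total operating time / number of failures observed."""
    return total_operating_hours / failure_count

def reliability(t, mtbf_hours):
    """Probability of surviving to time t under a constant failure
    rate: R(t) = exp(-t / MTBF)."""
    return math.exp(-t / mtbf_hours)

# Hypothetical fleet: 10 units run 1,000 hours each, with 4 failures.
m = mtbf(10 * 1000, 4)       # 2,500 hours
r = reliability(1000, m)     # chance one unit survives 1,000 hours
print(m, round(r, 3))
```

Note that an MTBF of 2,500 hours does not mean a unit typically lasts 2,500 hours; under the exponential model, roughly 63% of units fail before reaching the MTBF.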
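The safety factors quoted above are simply the ratio of a design’s capacity to its expected load. A minimal illustration with made-up numbers (the cable rating and load here are hypothetical, chosen only to land in the elevator range mentioned in the episode):

```python
def safety_factor(capacity, expected_load):
    """Safety factor = capacity / expected load, in matching units."""
    return capacity / expected_load

# Hypothetical elevator cable: rated for 110 kN, expected load 10 kN.
sf = safety_factor(110.0, 10.0)   # 11.0, the elevator figure above
print(sf)
```

Aerospace designs accept much thinner margins (1.2 to 1.4) because excess structural weight carries its own costs and risks.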
Photo of the Challenger explosion provided by NASA. Podcast theme music provided by Paul Stevenson.