Fly By Wire Air is a one-stop shop for the aviation enthusiast. You will find aviation apparel, RC hobby planes, items for the historic aviation buff and even products and services for amateur pilots. We hope you will enjoy visiting our site. When you think of flying – Fly By Wire.
Pretend you’re the pilot of a large jetliner. You’ve completed pre-flight checks, both inside and outside, and are ready for takeoff. As you climb, the plane begins to vibrate and then pitch to one side. The number two engine then separates and you are faced with a decision – jettison the remaining fuel or make a heavy landing with fuel on board. While engine separations are not frequent occurrences in air travel, they can have tragic consequences for both the plane and the surrounding area. During the course of this blog, we’ll review two key cases involving such incidents.
In May 1979 a McDonnell Douglas DC-10 operating American Airlines Flight 191 was making a regularly scheduled passenger flight from O’Hare International Airport in Chicago to Los Angeles International Airport. Moments after takeoff, the aircraft plummeted to the ground, killing all 258 passengers and the crew of thirteen, along with two people on the ground. A subsequent investigation by the NTSB revealed the number one engine had separated from the left wing, flipped up and over the top of the wing, and landed on the runway. As it tore away, the engine severed several hydraulic lines, allowing the left wing’s leading edge slats to retract, and damaged a three-foot section of the wing. As the plane climbed, it entered an asymmetric-lift condition in which the left wing, with its slats retracted, produced far less lift than the right wing while the remaining engines ran at full takeoff power. This condition caused the aircraft to roll abruptly to the left, reaching a bank angle of 112 degrees before crashing.
While the cause of the DC-10 engine loss was later traced to a damaged pylon structure connecting the engine to the wing, several other factors also played a role in the crash. The number one hydraulic system, normally powered by the number one engine, failed with the separation, though it could still draw on motor pumps tied to the number three system. Hydraulic system three was also damaged, but despite leaking fluid it continued to provide pressure until the crash. Electrical problems were also a factor in the crash of Flight 191. The number one electrical bus, powered by the number one engine, failed, taking several electrical systems offline, including the captain’s instruments, stick shaker and wing slat sensors. As a result of this partial electrical failure, the flight crew received a warning of a number one engine failure – not of its loss. Though a closed-circuit television screen behind the pilot allowed the crew to view the passenger compartments, it too lost power with the engine. After the Flight 191 incident and three other DC-10 crashes during the 1970s, a number of major airlines began to phase out the DC-10 in the early 1980s in favor of newer, more fuel-efficient jetliners such as the Boeing 757 and 767. While the phaseout was driven mainly by fuel efficiency, questions about the aircraft’s safety cast a cloud over its service.
The DC-10 wasn’t the only wide-body jet to experience engine separation. In October 1992 an El Al Israel Airlines Boeing 747-200 cargo plane (Flight 1862), with three crew members and one passenger on board, began a flight from John F. Kennedy Airport, New York to Ben Gurion International Airport, Tel Aviv with an intermediate stop at Schiphol Airport, Amsterdam. Weather conditions were favorable at the time of departure, and all pre-flight checks had been performed with no defects found. About ten minutes out of Schiphol, the flight data recorder indicated both engines 3 and 4, along with their connecting struts, had left the aircraft. The co-pilot transmitted an emergency call to Schiphol, requesting a return to the airport. However, the aircraft could not make a straight-in approach, due to both its altitude and its proximity to the airport. The air traffic controller therefore had to vector the El Al plane back to the airport through a pattern of descending circles to lower its altitude for a final approach. About five minutes into the pattern, the flight crew informed the controller of the loss of engines three and four and reported they were beginning to experience flap control problems. The controller directed a new heading to the flight crew, but noticed the plane was taking 30 seconds to change headings. About three minutes later, the flight crew informed air traffic control they were receiving audible warnings indicating a lack of control and low ground proximity. Approximately twenty-five seconds later, the aircraft crashed into an eleven-story apartment building about seven miles from Schiphol Airport.
Both the number 3 and number 4 engine struts were recovered from Naarden Harbour, just east of Amsterdam, with both engines still attached. The remaining parts of the aircraft were located within a thousand-foot radius of the impact. From an analysis of the parts and their placement, investigators determined the number 3 engine separated first, traveling outboard and striking engine 4, causing it and its supporting strut to separate from the plane. The engine struts, or pylons, are designed as two-cell torque boxes that transfer vertical, horizontal and torsional thrust loads to the wing, acting as an aerial shock absorber. The Boeing 747 pylon was secured internally by five fuse pins, which provide enough strength to hold the pylon in place except under extreme loads, at which point the pins fail, allowing the engine to break away without damaging the wing fuel tanks. Boeing adopted this philosophy from experience with the earlier 707 and 727 models, which suffered a number of engine separations both on the ground and in flight. The crash of the El Al jetliner was attributed to the failure of a center fuse pin in the number 3 engine strut. The pin, a bottle-bore design, had cracked due to metal fatigue. The FAA had issued a directive in 1979 requiring airlines to inspect the fuse pins every 2,500 flight hours, since the bottle-bore design was prone to fatigue failure. The El Al 747 was one of a few aircraft that had not yet replaced its bottle-bore pins. As a result of the El Al crash and two other 747 crashes, the FAA mandated a retrofit of all Boeing 747 wing struts in 1995. The new strut design offered increased protection in the event of an engine separation, while still using fuse pins to protect the wing tank from damage during a ground impact.
As the two previous cases indicate, engine separations may result from a number of problems. Sometimes it’s a matter of faulty parts, while lack of proper maintenance plays a role in others. The overall design of the aircraft itself may be a factor. However, the safe operation of an aircraft requires a continual interplay of aviators, air controllers, engineers and the flying public to promote flight safety.
In late 1945 the USAAF was at a crossroads. While the B-29 Superfortress had been a capable platform for carrying the war to Japan, future requirements dictated an aircraft of intercontinental range, in excess of five thousand miles. The Convair B-36 Peacemaker met this requirement, but would not enter service for three more years. Further complicating matters, General Curtis LeMay and several other forward-thinking generals were considering a jet-powered bomber. Within a few years, the generals and engineers got together and designed a truly great jet bomber – the Boeing B-52 Stratofortress. During this blog we will tell the story of the B-52, its development and its long service record with the USAF.
In addition to the range requirement, other performance characteristics specified by the Air Materiel Command in 1946 were a cruising speed of 300 mph at an altitude of 34,000 ft., a minimum payload of 10,000 lbs. and five or six 20 mm gun turrets. The AMC issued bids later that year, with Boeing, Glenn L. Martin and Consolidated Aircraft submitting proposals. The Air Force accepted the Boeing proposal, an aircraft powered by six turboprop engines with a range of 3,110 miles. The Boeing plane, designated Model 462, was a straight-winged aircraft with a gross weight of 360,000 pounds – a heavy plane for its day. As a result of the weight issue, the Air Force began to have doubts about the ability of the aircraft to perform its mission. Boeing then offered a smaller follow-up design, Model 464, having four engines and a 230,000-pound gross weight. While the 464 was deemed acceptable, the Air Force changed its requirements within a few months to a plane with a 400 mph cruising speed and a 300,000-pound gross weight. Additionally, the Air Force wanted an aircraft with a range of twelve thousand miles, capable of delivering a nuclear weapon. These modifications increased the gross weight of the plane to 480,000 lbs.
Boeing responded by proposing two bombers, Model 464-16 and Model 464-17. Both were four-engine turboprop designs, with the Model 16 being a nuclear-only aircraft carrying a ten thousand lb. payload. The Model 17 was a conventional bomber, able to mount a 9,000 lb. payload. By mid-1947 the Model 17 aircraft was deemed acceptable by the Air Force, except for the range requirement. By now designated the XB-52, the aircraft offered only marginal performance gains in speed and range over the Convair B-36, which was about to enter service. The Air Force then postponed the project for six months in order to evaluate its potential. After a series of intense discussions between Boeing and the Air Force, the XB-52 project was back on track in January 1948, with Boeing urged to include the latest aviation innovations in the bomber design, such as jet engines and aerial refueling. In May 1948 Boeing studied substituting jet engines for the turboprops, though the Air Force still leaned toward a turboprop design, since jet engines of the era lacked fuel efficiency. October 1948 proved to be a crucial month for the XB-52 project. Boeing engineers George Schairer, Art Carlsen and Vaughn Blumenthal presented a refined turboprop design to Colonel Pete Warden, Director of Bomber Development for the USAF. After reviewing the proposal, Warden asked the Boeing design team if they could prepare a proposal for a four-engine turbojet bomber. The following day Colonel Warden reviewed the new design, requesting an improved version. After returning to their hotel room, Schairer, Carlsen and Blumenthal were joined by Ed Wells, Boeing Vice President of Engineering, along with two other Boeing engineers, Bob Withington and Maynard Pennell. After eight hours of intense deliberation, the Boeing team had designed an entirely new airplane.
The new XB-52 concept had 35-degree swept wings, based on the B-47 Stratojet, with eight engines paired in four pods below the wings, bicycle landing gear and outrigger wheels beneath the wingtips. The XB-52 also had steerable landing gear, which could pivot 20 degrees from the aircraft centerline to compensate for crosswinds on landing. Warden approved the design the following week, and the Air Force signed a contract with Boeing in February 1951 for an initial production run of 13 B-52As.
When the B-52 entered service in 1955, it was assigned to the Strategic Air Command (SAC) to deliver nuclear weapons under the doctrine of massive retaliation. Carrying a 50,000 lb. payload and capable of flying nearly halfway around the globe, the Stratofortress was ideally suited for its role and soon became the standard for future bomber aircraft. Three B-52s from March AFB set a round-the-world flight record in 1957. As with all new aircraft, however, it had its share of teething troubles. For example, the split-level cockpit had climate control problems: while the pilot and co-pilot endured sunlight exposure on the upper deck, the navigator and observer nearly froze on the lower deck. Early B-52 models were often grounded by both electrical and hydraulic issues, and the Air Force assigned contractor teams to B-52 bases to troubleshoot problems as they arose.
By the late 1950s, advances in Soviet surface-to-air missile (SAM) capabilities brought about a major upgrade in the electronic countermeasure capabilities of the B-52. The threat also caused SAC to change its philosophy from high-altitude bombing to low-level penetration. The switch to low-altitude bombing required a number of modifications to B-52 components: an updated radar altimeter, structural reinforcements, modified equipment mounts, an enhanced cooling system and terrain-avoidance radar were all necessary to support missions flown at altitudes as low as 500 ft. By the end of the decade, B-52 capabilities had increased with the addition of the Quail and Hound Dog missile systems. The Quail, a decoy missile, was carried in the aft bomb bay of the B-52 and launched while en route to the target. The missile was programmed by the crew to match the speed and altitude of the B-52, thus confusing Soviet radar. Each Stratofortress carried four of these in addition to its regular nuclear payload. North American’s entry, the AGM-28 Hound Dog, was an offensive missile launched from the B-52 to carry a nuclear warhead to its target. With a Mach 2 speed and an operating altitude ranging from 500 to 60,000 ft., the Hound Dog was able to penetrate enemy air defenses to a range of 600 miles. The primary drawback of the Hound Dog was its weight: at 20,000 lbs. each, the B-52 could only carry two of them, with a corresponding fifteen per cent loss of range.
The 1960s saw a change of doctrine for SAC. With the emergence of land-based intercontinental ballistic missiles (ICBMs), as well as sea-launched ballistic missiles (SLBMs) fired from submarines, the manned bomber force became one leg of a nuclear triad. The primary advantage of the missile legs was their relative invulnerability to enemy attack. They were also cheaper to operate than a manned bomber fleet. Both ICBMs and SLBMs offered a quick response to an enemy attack, while a response from manned bombers took considerably longer to mount. The growing threat from Soviet ICBMs was another factor countering the effectiveness of the manned bomber leg. Due to the potential for conflict in Berlin, Cuba and a number of third-world countries, the Kennedy Administration decided to scrap the policy of massive retaliation, replacing it with the doctrine of flexible response. Instead of maintaining a large nuclear umbrella with small conventional forces, conventional forces were increased in order to keep any potential war from escalating to the nuclear threshold. Under the flexible response doctrine, nuclear weapons were to be used in a limited role against selected targets. Thus the B-52 had a new mission: to loiter on patrol at the edge of Soviet airspace, ready to strike designated targets in a retaliatory role. The Stratofortress was the ideal plane for the job, having the range, speed and payload, as well as an aerial refueling capability.
While the B-52 was designed as a nuclear weapon delivery system, it served an entirely different purpose in Viet Nam. In 1964 seventy-four B-52s were modified with external bomb racks, which could carry an additional twenty-four 750 lb. bombs. The following year Operation Rolling Thunder began, in which the USAF commenced bombing missions in both North and South Viet Nam, with the primary role of the Stratofortress being to support ground operations in the South. The first B-52 mission, Operation Arc Light, was conducted in June 1965, bombing a suspected Viet Cong stronghold in the Ben Cat District of South Viet Nam. Twenty-seven B-52s participated in the raid, bombing a one-mile by two-mile box. Though only partially successful, the raid proved the potential of the B-52 as a ground attack weapon. Later that year, a number of B-52s underwent modifications to increase their capacity for carpet bombing. These raids were devastating to anyone in or near the target areas. B-52s bombed North Viet Nam in late 1972 during Operation Linebacker II. These missions helped bring about the peace talks which ended the war, although at a loss of 15 Stratofortresses. During that campaign, B-52 gunners claimed two North Vietnamese MiG-21s – the first hostile aircraft shot down by the plane.
The Stratofortress went on to provide ground support in Operation Desert Storm in 1991, Operation Allied Force in Serbia in 1999, Operation Enduring Freedom in Afghanistan in 2001, as well as Operation Iraqi Freedom in 2003. Over its 63-year career, the B-52 has proven itself both a durable and an adaptable plane, receiving numerous modifications along the way. It has dropped bombs, launched missiles and served as an experimental platform, including launching the X-15 rocket plane. Current efforts by Boeing to re-engine the Stratofortress are projected to extend its service life through 2040. One could say of the B-52: it’s the plane that keeps on flying.
During the last five years, the use of and uses for drones have increased exponentially. In this blog, we’ll trace the employment of drones in a number of industries.
While much of the current drone technology isn’t new, recent investments in both capital and technology have made drones a practical tool in a number of industries. The agricultural sector is one in which drone applications are on the rise. With the global population projected to reach about 9 billion by 2050 and agricultural consumption to increase by 70 per cent over the same period, drones have the potential to revolutionize that sector of the economy. Agricultural drones are high-tech systems which perform many tasks a farmer can’t, such as conducting soil scans, monitoring crop health, applying fertilizers and water, tracking weather, estimating yields, and collecting and analyzing data. With the FAA currently streamlining regulations for agri-drone use, agricultural systems could eventually account for approximately 80% of all drones produced, according to a recent study by Bank of America Merrill Lynch.
A number of construction companies are exploring the possibilities of utilizing drones, or UAVs (Unmanned Aerial Vehicles), in that industry. Drones have a number of roles in construction, among them marketing, surveying, inspection, progress reporting, safety and monitoring workers at multiple sites. In the survey role, drones allow contractors to get detailed information about a job site, as well as conditions on surrounding properties. While site surveyors are necessary in some situations, drones can perform essentially the same function at a fraction of the cost. In the realm of construction inspection, drones offer a high degree of flexibility. For example, drones can effectively scan the roof of a skyscraper, revealing any possible construction faults. They are also useful at sites such as tunnels and bridges, which may be inaccessible from the surrounding land. The contractor can even use the drone to compare the construction to the actual plans of a project. Drone photography can be utilized to show aerial views of a site from different angles to determine the feasibility of construction, and these photos can be sent to a number of potential contractors during the bid process. The same capability is also useful for showing job progress to developers, who may not be able to visit the site on a regular basis. Finally, drones provide a means of monitoring the safety of workers at multiple sites, keeping the contractor informed of any safety issues in real time, at a fraction of the manpower and cost of on-site supervisors.
Drones also have potential in the commercial sector. For example, Walmart is currently utilizing drones comparable to those used in agriculture to scan warehouse inventory, checking for missing or misplaced items. Drones flying through a warehouse are able to complete an inventory in a day – a task that would take an on-site warehouse crew a month. Though still in its early stages, drone delivery is being tested by a few major companies. Domino’s Pizza began a delivery service in Britain, in which a drone was able to deliver two pizzas per trip – a service with the obvious advantage of avoiding traffic jams. In Philadelphia, a dry cleaning service is using drones to make emergency deliveries of laundry to customers. Though weight restrictions are a problem, the drones are capable of flying a freshly cleaned suit to a customer’s front door. The latest evolution is party drones, which fly over an outdoor party playing prerecorded music.
While drones haven’t been adopted on a mass scale, they have increased the functionality of a number of key industries, breaking through traditional barriers. From quick deliveries to monitoring construction progress to agriculture, drones increase work efficiency and productivity, improving customer service, safety and security – with little or no manpower. According to a recent PricewaterhouseCoopers study, drone-related activity provides an economic boost of more than $127 billion globally. With the relaxed FAA flight rules approved in 2016, drone operators now have more operational flexibility. As it becomes cheaper to develop industry-specific drones, subsidiary niche markets will emerge. A recent study indicates the use of commercial drones could add $82 billion and 100,000 jobs to the national economy by 2025 – not bad for a young industry.
While the United States was a pioneer in aviation development during much of the twentieth century, many of its airports border on a state of decay. During the course of this blog, we’ll examine the current state of the nation’s airports, as well as a number of proposed solutions.
Though many complain about airports, often as a result of troubled airline experiences, comparing major air hubs in the United States to their more modern overseas counterparts is perhaps unrealistic. Each airport has its own unique history in relation to the community it serves. Aviation development in the US increased dramatically after World War II, with airport construction complementing that effort. Many of the prime airports in the United States were conceived in an era before the proliferation of both foreign and domestic air routes. Most airport renovation efforts over the last 30 years have involved a limited patchwork process, since many of the hubs are surrounded by urban areas – unlike the modern air hubs of Asia and the Middle East, which serve emerging markets and emphasize architecture and aesthetics over serving large volumes of passengers. For example, Dubai’s main airport covers an area of about 7 million square feet, designed to serve 25-30 million passengers per year, while the JetBlue terminal at JFK airport serves approximately 22 million passengers per year in an area of less than 1 million square feet. Post-9/11 security and related requirements have also placed additional stress on US airports. The financial and environmental costs of airport construction often make such proposals a political liability. The ownership and control of airports in the United States – a landlord-tenant model between the airlines and the municipalities – also serves to inhibit progress.
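The throughput gap in the Dubai-versus-JetBlue comparison above can be made concrete with a quick back-of-the-envelope calculation. The figures are the approximate ones quoted in the text; taking the midpoint of Dubai’s 25-30 million range and treating all numbers as exact is my simplifying assumption.

```python
# Rough passengers-per-square-foot comparison, using the approximate
# figures quoted above (treated as exact here for illustration only).

dubai_area_sqft = 7_000_000       # Dubai main terminal, ~7 million sq ft
dubai_passengers = 27_500_000     # midpoint of the 25-30 million/year range

jetblue_area_sqft = 1_000_000     # JetBlue terminal at JFK, <1 million sq ft
jetblue_passengers = 22_000_000   # ~22 million passengers/year

dubai_density = dubai_passengers / dubai_area_sqft
jetblue_density = jetblue_passengers / jetblue_area_sqft

print(f"Dubai:   {dubai_density:.1f} passengers per sq ft per year")
print(f"JetBlue: {jetblue_density:.1f} passengers per sq ft per year")
print(f"JetBlue handles ~{jetblue_density / dubai_density:.1f}x the traffic per sq ft")
```

On these rough numbers, the JetBlue terminal absorbs over five times the passenger traffic per square foot, which is the crowding problem the paragraph above describes.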
Given the constraints on space in many urban areas, airport designers are forced to build up rather than out. In a practical sense, any airport restructuring begins with the check-in process. By placing security and check-in on separate levels, traffic flow is segregated between the two functions. Such an organization divides passengers into two categories – those who are able to check in with the aid of mobile devices, and those who use the more traditional (paper) approach and may require assistance to board their flight. Both groups must pass through security before boarding. Such an arrangement could cut pre-flight processing time by as much as 40 per cent. As mobile technology becomes more dominant, it offers air carriers both the convenience and flexibility to book flights outside the confines of an airport. Satellite check-in sites at hotels, restaurants and shopping centers allow airlines the option of verifying and staging passengers from remote locations, requiring less staff and processing time. Delta Air Lines, for example, has set up its own security service at a major airport from which to process passengers. This concept provides both security and marketing benefits.
A recent trend in airport check-in procedures is the use of self-service technology. Miami International Airport purchased approximately 45 automated kiosks to reduce customs and immigration processing time. These kiosks can process a passenger within two minutes, making what was once a grueling check-in process a relatively seamless one. Several major air hubs are also embarking on outside improvements to enhance the passenger experience. For example, Chicago O’Hare began a $15 billion capital investment program in 2005, transforming its system of intersecting runways into a series of parallel ones, which will increase capacity by 60% while substantially reducing delays. An additional control tower, runway and cargo center are under construction at O’Hare and are slated to be operational in about three years. Los Angeles International began an $8.5 billion expansion program in 2006, with construction completed on the new Tom Bradley International Terminal in 2013, featuring new dining, gate and retail areas designed to meet the needs of international tourists. Related projects include the updating of Terminal 6 to accommodate large-scale aircraft such as the Airbus A380. LAX is also building a new Central Utility Plant, along with taxiway and runway improvements.
Understanding how the above innovations affect terminal operations will be the key to the future success of the nation’s airports. As air traffic continues to grow despite economic and other setbacks, passengers will continue to demand more control over their travel experience. Airport planners must continue to emphasize key passenger services such as transit, parking and baggage claim to remain competitive, while focusing on the core mission of airports as gateways to the world.
After my grandson flew his newly purchased quadcopter a few weeks ago, I was stunned by the quality of video produced by its camera. During the course of this blog, we will trace the development of compact cameras, as well as their effect upon radio-control models.
The evolution of drone cameras began in 1901, when renowned photographer George Lawrence conceived the idea of attaching a camera to a balloon to take photos of banquet halls and outdoor ceremonies. Lawrence developed a panoramic camera with a relatively slow shutter speed, which proved ideal for area photographs. While his first balloon pictures were a success, both he and the balloon crashed, with Lawrence surviving a 200 ft. fall without injury. He then developed a camera platform using a series of kites connected by bamboo shafts to support the weight of the camera, running a steel piano wire from the ground up to carry the electrical current that would trip the camera shutter. The photos were retrieved by parachute. The system was so successful that Lawrence used it to photograph San Francisco after the 1906 earthquake – earning him $15,000.
However, it wasn’t until the advent of digital technology in the 1970s, which allowed photography to become more adaptable, that compact cameras became feasible. A digital camera is a hardware device which takes pictures like a conventional camera, but stores the image as digital data instead of recording it on film. Most digital cameras are now capable of recording video in addition to taking photos. Perhaps the earliest precursor to digital photography came in 1957, when Russell Kirsch, a pioneer of computer technology, developed an image scanning system utilizing a rotating drum to create images – the first scanned image being a picture of Kirsch’s son. In 1969 the charge-coupled device was created at AT&T Bell Labs: a semiconductor capable of gathering charge from photoelectric sensors and transferring it to a storage capacitor. Analog data could be read from a light-sensitive chip and converted into a digital grid, producing an image. In 1974 Bell Laboratories developed a charge transfer system, which could store and transfer charge carriers containing pixel data in serial order. This system was further refined by Bell in 1978, producing a charge transfer imaging device built with solid-state technologies. The refined system was both more cost-effective and free of the smearing aberrations created by similar image capture devices.
In 1973 Eastman Kodak took a gamble and hired Steve Sasson, a young electrical engineer. Sasson was one of a small cadre of electrical engineers employed by Kodak, a company well known for its chemical and mechanical engineering projects. Sasson was directed to capitalize on the capabilities of a charge-coupled device created by Fairchild Semiconductor, which could transmit and store images of 100 by 100 pixels. In 1975 Sasson completed a prototype camera incorporating the charge-coupled device, adapting a lens from an eight-millimeter film camera, an analog-to-digital converter from a Motorola digital voltmeter, and a digital-data cassette recorder for storing image data. With this combination, Sasson and other Kodak technicians could capture an image and record it to a cassette in a mere 23 seconds.
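To get a feel for how primitive that 1975 data path was, we can estimate the effective data rate from the figures above. The 100-by-100 pixel count and the 23-second recording time come from the text; the 4-bit grayscale depth per pixel is my illustrative assumption, since the prototype’s exact bit depth isn’t stated here.

```python
# Back-of-the-envelope data rate for the 1975 Sasson prototype camera.
# Pixel count and recording time are from the text; the 4-bit
# grayscale depth is an assumption for illustration.

pixels = 100 * 100          # 10,000 pixels per image
bits_per_pixel = 4          # assumed grayscale depth
record_time_s = 23          # seconds to write one image to cassette

total_bits = pixels * bits_per_pixel
rate_bps = total_bits / record_time_s

print(f"{total_bits} bits per image, ~{rate_bps:.0f} bits/s to tape")
```

Under that assumption the camera was pushing well under 2 kilobits per second to tape – a rate a modern quadcopter camera exceeds by many orders of magnitude.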
By 1990 several companies had begun to enter the digital photography market, creating a new segment for consumer cameras. The first digital camera ready for sale in the US market was the Dycam Model 1, released that same year. The Model 1 was capable of recording images at a maximum resolution of 376 by 240 pixels. Two developments in the 1990s further enhanced the marketability of digital cameras. The first was a codec for image compression, the precursor to the JPEG image file format of today. Compression dramatically increased the effective storage capacity of digital cameras over prior magnetic tape and floppy disc storage systems. By the mid-1990s Apple began to market the QuickTake 100, the most widely marketed digital camera in the United States. The QuickTake had a maximum resolution of 640 by 480 pixels and could store up to 24 images in 24-bit color. In 1995 Casio released the QV-10, the first consumer digital camera to include a liquid crystal display (LCD) screen, which allowed camera owners to quickly review newly photographed images. Other developments in the 1990s included a pocketable imaging device with an LCD screen capable of displaying images from a camera storage device, as well as a single-lens reflex digital camera which could reproduce images in 35mm film quality. By the end of the decade, digital cameras had reached resolutions of 2,000 by 2,000 pixels.
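A quick calculation shows why compression was so decisive for cameras of this class. Using the QuickTake-era figures quoted above (640 by 480 pixels, 24-bit color, 24 images), the uncompressed-storage baseline below is my illustrative assumption; the actual cameras compressed their images.

```python
# Uncompressed storage requirement for QuickTake-class specs
# (figures from the text; the "no compression" baseline is the
# illustrative assumption here).

width, height = 640, 480
bytes_per_pixel = 3          # 24-bit color = 3 bytes per pixel
images = 24

per_image_mb = width * height * bytes_per_pixel / 1_000_000
total_mb = per_image_mb * images

print(f"~{per_image_mb:.2f} MB per image uncompressed")
print(f"~{total_mb:.1f} MB for {images} images")
```

Roughly 22 MB of raw data was far beyond the affordable flash memory of the mid-1990s, which is why JPEG-style codecs, rather than bigger storage, made consumer digital cameras practical.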
In the early 2000s a merging of digital camera and lithium polymer battery technologies took place. The flexible polymer battery began to deliver performance approaching that of a gas engine, with less weight and volume on the RC model frame. Digital cameras were now both lightweight and efficient, capable of shooting still photos as well as video covering a relatively wide area. By 2010 a number of drones and smaller quadcopters carried removable flash drive units, which could be inserted into the RC model camera to record flight video. Once on the ground, the RC pilot could insert the drive unit into the USB port of a personal computer and play the video of the quadcopter flight on the monitor – a far cry from pulling piano wire to trip a camera shutter.
From the journeys of the Apostle Paul to the twenty-first century, missionaries have been on the move, proclaiming the gospel as well as meeting the physical needs of the communities they serve. During the course of this blog, we will trace the development of mission aviation from its earliest days to its global reach of today.
While missionaries were flown into Central America and the Caribbean region as early as the 1920s, it wasn't until after World War II that mission aviation developed into its own unique ministry. One of the first air ministry organizations was the Mission Aviation Fellowship (MAF), formed in 1946 by several World War II aviators who envisioned a role for aviation in spreading the gospel. The Mission Aviation Fellowship was initially established from three branches, under Jim Truxton of the United States, Murray Kendon of the United Kingdom and Edwin Hartwig of Australia. The earliest MAF efforts were in Mexico, Peru and Ecuador, with Betty Greene flying two Wycliffe Bible translators to a remote location in Mexico in 1946. By 2010 the MAF supported missionaries in 55 countries, transporting over 200,000 passengers and meeting global mission and humanitarian needs with 130 aircraft.
As a result of the increased global outreach of the Mission Aviation Fellowship and other aviation ministries, a need for pilot training programs became evident. In 1975 the Mission Aviation Training Institute (MATI) was formed. Upon retiring from the Air Force, Davis Goodman had been approached by the President of Piedmont Bible College to establish the flight training program for missionaries then under development by the college. Flight training had begun the prior year, with a single instructor, a borrowed aircraft and nine students at a local airport. Later in 1975 Goodman became the program director and purchased a Cessna 150 dedicated to training. Within four years the program leased space at a larger airport, followed by the addition of an Airframe and Powerplant Mechanic School in 1981. In 1984 Goodman ceded both ownership and operational control of Sugar Valley Airport and MATI (now the Missionary Aviation Institute) to Piedmont Baptist College. With more pilots than planes available for mission efforts, Goodman founded Aviation Ministries International (AMI) in 1984 with the primary tasks of fundraising and aircraft acquisition. By 2015 AMI (now the Missionary Air Group) was providing both mission and medical services to outlying areas in more than a dozen countries.
With the steady growth of mission aviation over the past seventy years, as well as improvements in transport systems in underdeveloped areas, some have questioned whether mission aviation is still relevant. However, when one considers the perspective of a pilot, a different picture arises. First, while the major cities of the world are easily accessible by jetliner, reaching remote local areas remains a problem. Transportation is not uniform within many of these countries, with highways turning into back roads within a fifty-mile radius of urban areas; a journey of a few hours by plane could take a day on foot. Secondly, roads are actually disappearing in some of the remote areas of the world. For example, in a number of African countries, routes that once allowed travel across the country in a couple of days are nearly impassable today, with bridges and roads in disrepair and overtaken by jungle growth due to political instability and inadequate funding. Also, in many instances air transport remains a cost-effective means of travel. A mission organization in Brazil chartered a motorized canoe for a trip up the Amazon river, only to find out it could have chartered a Cessna 206 float plane for the same rate. National aviation organizations now exist, fully staffed and funded by local mission groups: Asas de Socorro in Brazil manages five bases along the Amazon in addition to operating a flight school in Anapolis, training students from other Latin American countries. Finally, mission aviation remains the most flexible and responsive tool for reaching otherwise impassable areas. In Morocco, where mission work has thrived for years along the populated coastal cities, the Berber tribesmen of the Atlas Mountains remain without a church due to the ruggedness of the terrain and their relative isolation.
While watching the recent movie Sully, I was amazed at the sophistication of current flight simulators available to the major aircraft producers. During the course of this blog, we will trace the development of flight simulators from mere mechanical devices to the virtual reality electronics of today.
A flight simulator is a mechanical or electronic device that attempts to duplicate both aircraft flight and the environment in which the aircraft flies. Current simulators can replicate factors such as flight controls, wind, moisture and electronic system interaction. While flight simulation is used primarily for pilot training, it may also be used in aircraft design, as well as to study the effects of aircraft characteristics.
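In modern terms, the core of every simulator described below is a loop that advances the aircraft's state in small time steps from the pilot's control inputs. A deliberately minimal single-axis sketch makes the idea concrete (the constants are invented for illustration and resemble no real flight model):

```python
from dataclasses import dataclass

@dataclass
class State:
    pitch_deg: float = 0.0   # nose attitude, degrees
    rate_dps: float = 0.0    # pitch rate, degrees per second

def step(state: State, elevator: float, dt: float = 0.02) -> State:
    """One 50 Hz update. The elevator input (-1..1) commands a pitch
    acceleration; the damping term crudely stands in for aerodynamic
    stability. All constants are illustrative, not from any aircraft."""
    accel = 8.0 * elevator - 0.5 * state.rate_dps   # toy dynamics
    rate = state.rate_dps + accel * dt
    pitch = state.pitch_deg + rate * dt
    return State(pitch, rate)

s = State()
for _ in range(100):              # hold gentle back-stick for 2 seconds
    s = step(s, elevator=0.25)
print(f"pitch after 2 s: {s.pitch_deg:.1f} deg")   # nose rises steadily
```

Everything from the Link Trainer's pneumatics to today's full-motion simulators is, at heart, a physical or digital realization of a loop like this, with the state vector grown to hundreds of variables.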
The earliest flight simulators were used during World War I to teach gunnery techniques. These involved a static simulator with a model aircraft passing in front of it, helping both pilots and gunners develop correct lead angles to the target. This was the only form of flight simulation for nearly ten years. The Link Trainer, developed by Edwin Link in the late 1920s, capitalized on the pneumatic devices used in the player pianos and organs of the family musical instrument business. The first trainer was patented in 1930: an electrical suction pump drove the various control valves operated by stick and rudder action, while another motor simulated the effects of wind and other external disturbances. These actions could be manually adjusted to provide a variety of flight characteristics.
While the Link Trainer provided a quantum leap in capability over previous flight simulators, many in both the military and civil aviation communities believed live flight offered a better training environment. However, by the early 1930s the United States Army Air Corps needed flight simulators that could train mail pilots to fly long distances by instruments. One enhancement to the Link Trainer was a device called the course plotter, in which a self-propelled tracker with an inked wheel remotely traced the trainer's position, with communications between pilot and instructor facilitated by simulated radio beacons.
It was during the late 1930s that flight simulation began to be based on electronic applications. The Dehmel Trainer, developed by Dr. R. C. Dehmel of Southwestern Bell, coupled a Link Trainer with an advanced radio simulation system that could accurately duplicate navigation signals transmitted to a receiving aircraft, providing a state-of-the-art simulation of radio navigation aids. The Aerostructor, developed by A. E. Travis, used a fixed-base trainer with a moving visual presentation rather than radio and electronic signals. This presentation was based on a loop of film which depicted the effects of course changes, pitch and roll. While the Aerostructor was never mass produced, a modified version of it saw service with the US Navy.
During World War II, advances in aircraft design such as retractable landing gear, variable-pitch propellers and higher speeds created a demand for more realistic forms of flight simulation. In response, the Hawarden Trainer was developed, which used a cutaway center section of a Spitfire fuselage and allowed training in all aspects of operational flight. In 1939 the British needed a simulator that could train the navigators who were ferrying US aircraft across the Atlantic. The navigator was supported by a number of radio aids, as well as a celestial dome whose star positions changed with time, longitude and latitude. The Celestial Trainer, designed by Ed Link and P. Weems, was also modified to train bomber crews, with simulated landscapes giving the bomb aimer target sightings as they would appear from a moving aircraft. Rediffusion (Redifon) produced a navigation device in 1940 which simulated existing radio direction-finding equipment, allowing two stations to take a fix on an aircraft's position. By the end of the war, aircrews were being trained on simulated radar signals to acquaint them with the new types of radar developed during the war.
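The geometry the celestial dome had to reproduce is the standard spherical-astronomy altitude formula, which ties a star's apparent height above the horizon to the observer's latitude and to the hour angle (itself a function of time and longitude). A small sketch of the formula itself, not of the trainer's actual mechanism, with illustrative inputs:

```python
import math

def star_altitude(lat_deg, dec_deg, hour_angle_deg):
    """Altitude of a star above the horizon, from the standard formula
        sin(alt) = sin(lat)*sin(dec) + cos(lat)*cos(dec)*cos(HA)
    The hour angle HA advances with time and shifts with longitude,
    which is why the trainer's dome had to track both."""
    lat = math.radians(lat_deg)
    dec = math.radians(dec_deg)
    ha = math.radians(hour_angle_deg)
    sin_alt = (math.sin(lat) * math.sin(dec)
               + math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(math.asin(sin_alt))

# A near-polar star (declination ~ +89 deg) seen from 51 deg N sits at
# an altitude close to the observer's latitude - the classic Polaris fix.
print(round(star_altitude(51.0, 89.0, 30.0), 1))  # ~51.9
```

Running this table of sines and cosines continuously, for dozens of stars, is exactly the computation the dome performed mechanically.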
While the science of flight simulation had progressed dramatically over the previous thirty years, simulators were still unable to accurately duplicate the performance characteristics of a particular plane. This changed with the arrival of subsonic jetliners in the 1950s, as aircraft manufacturers began to produce more complete data from extensive flight testing. This data was stored on analogue computers, making it transferable, but requiring ever more hardware as aircraft testing became more sophisticated. By the early 1960s, digital computers began to replace the aging analogue units due to their increased data capacity and speed. The most successful of these, the Link Mark I, operated with three parallel processors (function, arithmetic and radio station selection), using a drum memory for data storage. By the 1970s the majority of computer systems could be adapted for flight simulation.
During that decade, computer image generation (CGI) technology became available for flight simulation. This technology, adapted from the space program, used a ground plane image supplemented by three-dimensional graphics. CGI has become far more sophisticated in recent years, mated to advances in digital computers – a far cry from the rolling ground plane pictures of the 1940s. Today flight simulation is a colossal industry, spanning the globe with a wide range of high-tech applications for both aircraft users and producers, enhancing the safety of crew and passengers alike.
When one considers prominent German-Americans, names such as Eisenhower, Nimitz, Kaiser and Kissinger come to mind. However, another German-American, not often cited, may have left an even greater legacy.
William E. Boeing was born in Detroit, Michigan in 1881 to Wilhelm Boing of Hagen-Hohenlimburg, Germany and Marie M. Ortmann of Vienna, Austria. The senior Boing was a mining engineer who became wealthy from holdings of timber lands and mineral rights near Lake Superior. After studying abroad in Switzerland, Boing added an e to his name to make it sound more Anglo. He then entered Yale, but left before graduating to join the family timber business in 1903. Buying a large tract of forest on the Pacific side of the Olympic Peninsula in Washington, Boeing began building boats as well as acquiring several lumber operations.
During a business trip to Seattle in 1909, Boeing saw his first plane and soon developed a keen interest in aviation. Within a few months he was taking flying lessons at the Glenn L. Martin plant in Los Angeles and had ordered a Martin TA Hydroaeroplane. Martin even sent one of his test pilots up to Seattle to give Boeing lessons on site. When the test pilot crashed the aircraft during a test flight, he informed Boeing that replacement parts would not be available for months. The problem frustrated Boeing, who had just received his pilot's certificate. After studying both the plane and the parts distribution at Martin, Boeing approached a friend, Commander George Conrad Westervelt, USN. When Boeing suggested they could build their own plane in less time, Westervelt agreed, and they formed their own aircraft company – B&W. Their first aircraft, the B&W seaplane, was an instant success, with Boeing purchasing an old boat factory on the Duwamish River outside Seattle.
When the United States entered World War I, Boeing and Westervelt received a government contract for fifty of the B&W seaplanes, with Boeing changing the name of the fledgling company to Pacific Aero Products Company. By the end of the war, Boeing began to emphasize commercial aircraft, in addition to providing a government-sponsored air mail service.
The air mail service came about because the commercial aviation market was flooded with surplus World War I aircraft, which were relatively inexpensive compared with new models. Boeing had to diversify at this point, selling furniture and a series of flat-bottomed boats called sea sleds. Within a few years, Boeing began to realize a profit from the overhaul of government aircraft and the sale of a few new models. During the 1920s and early 1930s, Boeing would become a major producer of fighter planes for the Army Air Corps.
In 1925 federal law opened air mail contracts to public bidding. Boeing received a contract, but needed a fleet of twenty-six planes to serve the Chicago to San Francisco route by July 1, 1927. As a guarantee, Boeing put up $500,000 of his own money to serve as a bond for the effort. The fleet consisted of Boeing's latest design, the Model 40, which had an open cockpit for the pilot and an enclosed cabin for two passengers. The mail service proved to be an unexpected market coup for Boeing, allowing him to haul passengers for a fee and start a new airline, Boeing Air Transport. It wasn't long before Boeing cornered the market in both aviation sectors.
In 1929 Boeing acquired Pacific Air Transport, merging it with both the Boeing Airplane Co. and Boeing Air Transport. The new company was named the United Aircraft and Transport Corporation. Later the same year, United purchased both the Pratt & Whitney engine and Hamilton Standard Propeller companies, as well as Chance Vought Aircraft. To expand its airline service, Boeing acquired National Air Transport the following year.
By 1934 Boeing's success began to draw the attention of the federal government. In June of that year Congress passed the Air Mail Act, under which aircraft manufacturers had to divest themselves of any airline services. As a result of this split, Boeing's holdings were divided into three companies: United Aircraft Corporation, which manufactured aircraft in the eastern United States (now United Technologies Corporation); Boeing Airplane Company, manufacturing aircraft in the western United States; and United Airlines, which served the air routes.
A week after the Air Mail Act was passed, Boeing resigned as chairman and sold his stock in the firm. Shortly after his resignation, however, William Boeing received the coveted Daniel Guggenheim Medal for achievement in the field of aviation. During World War II he came out of retirement to act as an advisor to the company as it met the demands of combat aircraft development. The company he started in 1916 went on to develop such influential aircraft as the B-17 Flying Fortress, B-29 Superfortress, B-47 Stratojet and B-52 Stratofortress. Boeing produced an equally impressive series of airliners, starting with the Stratoliner of 1939, the world's first pressurized airliner, followed by the jet-powered 707, 727 and 737, and the Boeing 747, the world's first Jumbo Jet. A recent first for Boeing was the successful development and production of the 787 Dreamliner, the first jetliner in service made of carbon-fiber materials. Boeing is now involved in the space technology sector, in addition to the production of aircraft. Not bad for someone who made the decision to build his own plane in 1916.
This article is the last of a series about the heroes of aviation.
While many of our ancestors arrived in this nation by ship – the only practical means of mass transit at the time – the subject of this blog chose a different but no less dangerous path to freedom. In his case, timing made the difference between life and death.
Kenneth H. Rowe (No Kum-Sok) was born in Sinhung, Korea on January 10, 1932. When Rowe was twelve years old, Korea was part of the Japanese Empire, and both Japanese culture and companies dominated the peninsula. Though Korean traditions and culture were officially shunned, Rowe's father worked for a Japanese corporation and made a relatively good living, providing Ken with both material and social advantages. By his teen years, Ken could speak both Korean and Japanese fluently. In 1944 the Japanese military began sending its pilots on suicide missions against the American navy in the Pacific and requested Korean volunteers. Although Rowe was only twelve, he asked his father if he could volunteer to serve as a kamikaze pilot. His father discouraged him, conveying his belief that the United States would ultimately win the war. This aroused a curiosity in Ken about the United States and its people.
While Rowe began to express pro-American sentiments to his classmates, he had to be careful about doing so, since the Soviets had occupied Korea north of the 38th parallel after World War II and installed a Communist regime. After several years of dictatorship under Kim Il-sung, Ken became convinced he had to leave North Korea, but ironically decided that becoming an ardent Communist would give him the means to do so. Rowe's zeal caught the attention of the North Korean military, and he soon trained to become a fighter pilot.
Ken began flying combat missions in Soviet-built MiG-15 jet fighters in 1951. Although he flew nearly a hundred missions during the course of the war, he sought to avoid dogfights with USAF jet fighters, which enjoyed both qualitative and quantitative advantages. In September 1953, two months after the end of the Korean War, Rowe (No) saw his chance. His squadron was on a training mission from Sunan Air Base, just outside the North Korean capital of Pyongyang. With near-perfect flying weather, Rowe was able to veer away from his unit and set a course across the 38th parallel into South Korea. He knew the odds were against him landing safely at an American air base, but after a fifteen-minute flight Rowe landed safely at Kimpo Air Base, just outside the South Korean capital of Seoul. He later discovered the USAF radar had been shut down for maintenance work that morning, though he barely missed a collision with an American jet fighter landing on the same runway from the opposite direction.
Rowe (No) spent the next six months on Okinawa as a consultant to both the USAF and the CIA on the capabilities of the MiG-15, as well as providing insight into North Korean air combat strategies. Ken arrived in the United States in 1954, working as a paid contractor to a number of US intelligence agencies. During that time he often traveled by rail between Washington DC and New York, passing through Newark, Delaware – home of the University of Delaware School of Engineering. Intent on pursuing his education, Rowe enrolled in the UD engineering program, earning degrees in both mechanical and electrical engineering. He was well situated upon graduation: the $100,000 reward for defecting with the MiG (a reward Rowe had been unaware of) had been invested for him and was yielding a high rate of return.
When Rowe sought assistance from his CIA handlers in securing a green card to work in the US, they refused. He could only get temporary visas, the result of an agreement between the CIA and the government of South Korea, which wanted him to join its air force upon graduation. Through a close relationship with a history professor at UD, Ken was introduced to a Senator from Delaware, who introduced a bill granting him citizenship. The bill was eventually signed by President Eisenhower, and the CIA was instructed not to interfere if Rowe sought permanent immigration status on his own.
In 1957 Ken was reunited with his mother, who had been living in South Korea. Though he was not yet fluent in English, he quickly adapted to life in the United States. Rowe pursued a varied and successful career in aeronautical engineering, working for a number of key aviation firms such as Grumman, General Dynamics, Lockheed and Boeing, as well as General Electric, DuPont and Westinghouse. After leaving the corporate world, Rowe served as an aeronautical engineering professor at Embry-Riddle University – making him a true hero of aviation, both inside and outside of the cockpit.
This blog is the fifth of a series about the heroes of aviation.
Aircraft designers and artists share a common trait – the ability to think outside the box and incorporate new concepts into their work. While the artist strives to create a pleasing appearance, whether in painting or sculpture, the aircraft designer must first meet a set of performance criteria in order to produce a successful aircraft, the artistic form being of secondary importance. During the course of this blog we'll trace the career of an engineer who designed a number of aircraft that achieved both impressive performance and appearance.
Clarence Leonard "Kelly" Johnson was born in Ishpeming, Michigan on February 27, 1910. Johnson decided to pursue a career in aeronautical engineering at the age of twelve, largely as a result of reading a series of Tom Swift novels. A few months later he designed his own small plane, which he named the Merlin 1 Battle Plane. After seeing a Curtiss Jenny in flight during a local exhibition, he became interested in flying aircraft as well as designing them. During his high school years Kelly moved to Flint, where his father had a construction business, and worked part time in the motor test section of Buick, gaining practical engineering knowledge. By the time he completed high school, Kelly had saved about $300 to defray the costs of flight school. When Johnson approached the flight instructor, however, the instructor persuaded him to use the money to further his education instead.
While Johnson was surprised at the instructor's response, he respected it, and after holding a number of odd jobs, graduated from the University of Michigan in 1932 with a Bachelor of Science in Aeronautical Engineering. After a number of teaching fellowships, as well as serving as a consultant to the university, he received a Master of Science in Aeronautical Engineering the following year. Johnson's first assignment at Lockheed in 1933 was to design the tools from which aircraft were built. However, it wasn't long before he was involved in the design of Lockheed's front-line aircraft of the era, such as the Model 10 Electra flown by Amelia Earhart. Johnson would later design the military version of the Electra, the Lockheed Hudson, for the British from a set of sketches he made in his hotel room. By 1938 Kelly was serving as an assistant to Lockheed's chief engineer, Hall Hibbard. In 1937 the Air Corps had contracted with Lockheed to produce an aircraft capable of speeds in excess of 400 mph, with nearly double the range and firepower of existing fighter aircraft. Within a year, Hibbard and Johnson designed a twin-boomed plane, a radical departure from current practice, armed with four .50 caliber machine guns and a 20 mm cannon in the nose, its large internal fuel capacity augmented by detachable drop tanks underneath the inner wing panels. The aircraft was test flown in 1939 and entered service in 1941 as the P-38 Lightning. The P-38 proved to be a versatile plane, performing a variety of missions ranging from ground attack to the night fighter role.
In 1943 Hibbard and Johnson were presented with a new challenge. Both Germany and Britain were developing fighter aircraft driven by jet propulsion, while USAAF efforts lagged. Adding urgency were intelligence reports received in early 1943 about a German jet fighter undergoing advanced testing, the Me 262. Fearful the new German fighter would soon become operational, the USAAF awarded Lockheed a contract, and Johnson promised the design would be completed within six months. Hibbard and Johnson decided to build the new jet fighter around the existing British de Havilland Goblin engine, already in use in the Gloster Meteor. Within a mere 143 days the new jet fighter, the P-80 Shooting Star, had completed its first test flight, and production began two months later. While too late to see action in World War II, the P-80 saw extensive action in Korea in both the ground attack and aerial combat roles. Variants of the P-80/F-80 remained in use until 1997.
Due to a perceived Soviet bomber threat, the CIA issued a requirement in late 1953 for an aircraft capable of scanning large segments of Soviet territory from an extremely high altitude. During the last year of the Korean War, several Convair B-36 bombers had flown over Manchuria, taking pictures of MiG bases from a relatively high altitude. The large bomb bay area, long wings, and the high-altitude dash capability of its four auxiliary jet engines made the B-36 a good camera platform for its time. The proposed aircraft would not be as big, but would have long, glider-like wings coupled with a lightweight fuselage powered by a single jet engine. The contract was awarded to Lockheed the following year, and Kelly Johnson went to work. The initial specifications called for an aircraft capable of operating at an altitude of 70,000 ft with a range of 1,700 miles. Johnson paired a shortened fuselage derived from the experimental F-104 Starfighter with long, slender wings. The design was powered by the General Electric J73 jet engine and emphasized weight saving, discarding features such as landing gear and ejection seats; it took off from a special cart and belly landed when returning. The aircraft, designated Utility Two or U-2, could cruise at an altitude of 73,000 ft with a range of 1,600 miles. By 1955 the U-2 was in production, and CIA operators were flying it over the world's trouble spots the following year. Flights over the Soviet Union ended in May 1960, when Francis Gary Powers' U-2 was shot down by a Soviet SA-2 missile. However, the U-2 continued to serve elsewhere, providing valuable intelligence during the Cuban Missile Crisis of 1962, and the aircraft has remained in service for over 50 years.
In the 1960s Johnson designed the successor to the U-2, the SR-71, a twin-engine, twin-tail, delta-winged reconnaissance aircraft capable of sustained Mach 3 speeds, with a service ceiling in excess of 85,000 ft and a range of 2,900 miles. From a technology standpoint, the SR-71, or Blackbird, was a totally new design made largely of titanium – much of it, ironically, imported from the Soviet Union at the time. The SR-71 was in service for over 30 years and set a number of world speed and altitude records, many of them still standing. Kelly Johnson was instrumental in the design of some 40 aircraft during his forty-plus years at Lockheed, designing a number of great planes at pivotal times in our nation's history – making him a true hero of aviation.
This blog is the fourth in a series about the heroes of aviation.