Fly By Wire Air is a one-stop shop for the aviation enthusiast. You will find aviation apparel, RC hobby planes, items for the historic aviation buff and even products and services for amateur pilots. We hope you will enjoy visiting our site. When you think of flying – Fly By Wire.
In late 1950, as USAF B-29s were bombing North Korean supply lines in support of UN ground troops, they encountered a swept-winged enemy aircraft capable of great speed. While the introduction of the MIG-15 caught United Nations forces by surprise, its American counterpart would prove to be a legend among jet fighter aircraft.
As early as 1944, North American Aviation had proposed a jet aircraft design to the US Navy as a result of combat against the early German jet fighters, such as the ME-262. This design, the FJ-1 Fury, was, in essence, a jet version of the P-51 MUSTANG. Though its speed was impressive compared to piston-powered fighters, its overall performance failed to meet expectations. However, within a few months, the USAAF approached North American with a requirement for a medium-range, single-seat, high-altitude, jet-powered day escort fighter/fighter-bomber. In early 1945, North American submitted four designs to the Air Corps, which granted permission to produce three examples of the XP-86 (Experimental Pursuit) aircraft. While the XP-86 was a lighter plane than the Fury and could attain 582 mph, as opposed to 547 mph for the FJ-1, it could not meet the Air Corps requirement for a top speed of 600 mph. Furthermore, two rival designs, the XP-80 and XP-84, with speeds in the range of the XP-86, were already under development, raising the possibility that the XP-86 contract would be cancelled.
North American solved this problem with a leap in technology. The XP-86 was the first American aircraft to take advantage of German test data captured at the end of World War II, which indicated that a thin swept wing could greatly reduce drag and delay compressibility problems as an aircraft approached the speed of sound. Further study of the tests revealed that a swept wing would solve the speed problem, while slats on the wing's leading edge would enhance low-speed stability. Since the XP-86 was approaching an advanced stage of development, North American's senior management was hesitant to incorporate a swept wing. However, a series of wind tunnel tests showed that a 35-degree sweep offered the best performance, and the design gained automatic leading-edge slats and an electrically adjustable stabilizer modeled on the ME-262. As a result of combat experience gained in Korea, the leading-edge slats were later phased out in favor of a fixed, extended leading edge, 6 inches deeper at the wing root and tapering to 3 inches at the tip.
Though much of the design work was delayed until after the end of World War II, the first F-86 SABRE was completed on August 8, 1947, with the first flight occurring on October 1 of that year. The SABRE was first assigned to the USAF Strategic Air Command in 1949, prior to its deployment to Korea in late 1950. The F-86 set a number of speed records during its early years: an official world speed record of 671 mph in September 1948 and a 1951 Bendix Trophy win at an average speed of 553.76 mph. In May 1953, Jacqueline Cochran flew a SABRE to become the first woman to break the sound barrier.
When the Soviet MIG-15 was introduced in November 1950, it outperformed all UN aircraft, such as the straight-winged F-80 and F-84. The MIG was clearly a generation ahead of both types, as well as the F9F PANTHER, flown by the US Navy from carriers offshore. Three squadrons of F-86s were dispatched to the Far East in December 1950. Though the F-86 and the MIG-15 were evenly matched and based on similar design concepts, there were a number of differences. SABRES were more aerodynamically stable and could turn, roll and dive faster than the MIG. The F-86 could also go supersonic in a dive, while the MIG would suffer structural damage attempting to do so. The SABRE was also equipped with a radar gunsight, which allowed pilots to aim their .50 caliber guns quickly and accurately, even compensating for speed. The MIG-15's key advantages were faster climb and acceleration, effective handling at high altitudes, and somewhat greater maneuverability. Firepower between the two aircraft was a tradeoff: the SABRE fired a higher volume of smaller, more accurately aimed rounds, while the MIG fired less accurate but larger-bore (23mm and 37mm) ammunition. Perhaps the deciding factor in the air war over Korea was the quality of the pilots. Many of the MIGs were flown by Soviet pilots for roughly the first year of their deployment, a number of them aces from World War II and thus highly capable. The USAF followed the same philosophy, sending a number of World War II aces to Korea as well. While the Soviet pilots were well trained, the USAF training program at Nellis AFB was both broader and more intense. As Soviet pilots were rotated home, they were replaced by less capable Chinese and North Korean pilots, and as the war progressed, this was reflected in the loss ratio between the two aircraft.
While the overall loss ratio favored the SABRE by about eight to one at war's end (78 losses to 687), the ratio against Soviet pilots has been disputed in recent years, with a number of former Soviet pilots claiming a loss ratio of only two to one in favor of the SABRE. The most hotly contested battles were fought over an area near the mouth of the Yalu River known as MIG ALLEY.
After the Korean War, the SABRE was exported to a number of nations, including NATO allies such as the United Kingdom, Canada, West Germany, Greece, Spain, Norway and Turkey, as well as Taiwan, Japan, Pakistan and Saudi Arabia. The last SABRE was retired from the Bolivian Air Force in 1994. Though the SABRE was in service for many years, the high point of its career was in Korea, where a few brave pilots and planes made the difference in saving a nation.
Tragic as the events of September 11, 2001, were, they forced a needed examination of global aviation security. In this blog, we’ll look at both current problems and approaches to enhance the security of global air travel.
Since the 1970s, trade, technology, and economic growth have merged to form a state of globalization, in which the welfare of people, firms, and nations has become ever more interconnected. Concurrently, civil aviation has evolved from a heavily regulated system of government-sponsored air services and airports to an increasingly competitive global structure, in which private organizations compete with their publicly held counterparts. Global air traffic has increased exponentially over the last forty-five years, in spite of economic recessions, military conflicts, health epidemics and acts of terror. Due to the nature of its operations, civil aviation has always been a target for violent acts. The first violent incidents involving civil airliners were hijack attempts, which began in the 1960s. By the late 1970s, these were on the decline due to international treaties and plain-clothed security personnel on board aircraft. During the 1980s, bomb attacks designed to draw attention were on the rise, decreasing in later decades. By the 1990s, aviation security had evolved into a complex system combining intelligence agencies and airport security personnel with electronic devices to detect bombs, weapons and prohibited items.
The terror attacks of September 11, 2001 were the most graphic example of the ever-evolving threat against aviation. The attacks demonstrated how civil aircraft could be used as weapons to kill large numbers of civilians and destroy assets on the ground. Since that time, governments have created a number of new organizations to direct airport security systems and made massive investments in both technology and personnel. Though both airlines and airports have faced challenges resulting from heightened security efforts, the traveling public has been willing to bear them to promote a secure travel environment.
Today, several factors affect the security of global aviation. Technology is rapidly expanding the ability of terror groups and other bad actors to inflict large-scale damage. While the capability for such attacks was once confined to a few major nations, the technology is now available to a number of non-state organizations. The merging of cyber and physical capabilities is creating new security issues; one need only watch a virtual reality game to understand how closely simulations can approximate real-world situations. Many systems in civil aviation, such as traffic management, passport control, departure control, hazardous materials transport and reservation systems, are vulnerable to outside hacking. Computerized aircraft flight systems pose an equally serious threat: GPS navigation, fuel control, flight control and maintenance systems only increase the points of cyber vulnerability. As aviation becomes more computerized, human proficiency becomes less effective. Automated systems are becoming flexible enough to handle a variety of situations, minimizing human involvement. However, when humans have less opportunity to practice and develop skills, they become less capable of acting in a timely and appropriate manner when emergencies arise. Perhaps the most vulnerable points in many automated systems are those where humans interact with automated programs.
However, a number of approaches are available to enhance global flight security. There is currently too much emphasis on molding new problems into existing regulations; as is often the case, by the time new policies are formulated, a new threat has arisen. Global aviation firms should adopt a philosophy of thinking like the terrorist, rather than relying upon yesterday's doctrine to meet future attacks. In the realm of cybersecurity, firms must deepen their understanding of threats by having in-house staff or outside consultants test their systems, then tailoring those systems to meet the threats. Firms should also cooperate on both cyber and physical security threats, as cooperation makes everyone stronger; any would-be hacker will probe for the weakest link, so real and potential vulnerabilities should be shared between companies. Finally, airlines need to rethink border security in the digital sense. While the number of remote attacks has increased in recent years, air safety is improved by a thorough knowledge of passengers, an area in which more capable programs are needed.
Civil aviation is a key element of the global economy, and any event, whether accidental or intentional, draws immediate media attention. With new technology promoting the rapid transfer of information, aviation will continue to be a likely target for those who want to cause maximum disruption.
While the operation of radio control models as a hobby is relatively recent, the technology behind it dates back to the nineteenth century. During the course of this blog, we'll trace the evolution of rc model transmitters and their applications.
The first use of radio control technology dates back to 1898, when Nikola Tesla built a pair of radio controlled boats and demonstrated them to an astonished crowd at Madison Square Garden. Though only able to cruise a short distance, the boats showed the potential of radio control. During World War I, Archibald Low designed an aerial drone for the Royal Flying Corps for use as a radio-controlled guided bomb. By the 1930s, practical rc planes were available to hobbyists, with Walter and William Good building and flying the first fully-functional rc aircraft in 1937. As a result of progress in radio control technology during World War II, the use of rc models increased dramatically during the 1950s, though battery capacity was limited, requiring frequent recharging, until transistors became available.
Transistors also reduced the voltage requirements of a battery, virtually eliminating the older high-voltage batteries. Single-channel rc radio kits were introduced in the 1950s, with preassembled units offered later. In both tube and transistor radios of the era, the rc plane's control functions were operated by an electromagnetic escapement driven by a rubber-band loop, through which the rc pilot could control rudder and speed. Immediately after World War II, commercial rc plane control signals had been limited to two or three channels using amplitude modulation. By the 1960s, crystal-controlled superheterodyne receivers became available, offering true three-axis control of rc aircraft (yaw, pitch and motor speed), along with more sensitive signal selection and more stable control. More channel choices were added in the 1960s using frequency modulation, which offered twenty or more operating channels.
The next generation of rc model transmitters was developed in the mid-1970s, based on Pulse Width Modulation (PWM), a method by which an analog signal is produced from a digital source. PWM signals consist of two primary components that define their behavior: a duty cycle and a frequency. The duty cycle is the amount of time the signal is in a high (on) state as a percentage of the total time it takes to complete one cycle. The frequency determines how fast the PWM signal completes a cycle, and consequently how fast it switches between high and low states. By cycling a digital signal on and off at a fast enough rate, and with a certain duty cycle, the output will appear to behave like a constant-voltage analog signal when providing power to devices. Transmitted in rapid succession, these signals could control multiple functions on the rc model.
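To make the duty-cycle idea concrete, here is a minimal Python sketch. The 5-volt supply and 1,000-sample resolution are illustrative assumptions, not values from any particular rc system:

```python
def pwm_cycle(duty_cycle, samples=1000, v_high=5.0):
    """Return one PWM period sampled as a list of voltages.

    duty_cycle: fraction of the period spent in the high state (0.0-1.0).
    """
    high_samples = int(samples * duty_cycle)
    return [v_high] * high_samples + [0.0] * (samples - high_samples)

def effective_voltage(cycle):
    """The load sees roughly the mean of the switched waveform."""
    return sum(cycle) / len(cycle)

# A 30% duty cycle on a 5 V supply behaves like ~1.5 V to the load.
cycle = pwm_cycle(0.3)
print(effective_voltage(cycle))  # → 1.5
```

Doubling the duty cycle doubles the effective voltage the motor or servo sees, which is how a purely digital on/off output approximates a smoothly variable analog command.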
On the heels of PWM came Pulse Code Modulation (PCM). While analog technology uses continuous signals, digital technology encodes information into discrete signal states. When two states are assigned per digital signal, the signals are called binary signals, and a single binary digit is termed a bit. While PCM is an efficient means of signal transmission, it is by no means foolproof, due to the proliferation of radio control devices in both the hobby and industrial markets. To overcome this problem, some late-model FM receivers that still use PWM coding can be modified with advanced computer chips to detect the individual signal characteristics of a particular transmitter's pulse-width signal without needing a designated code, as required with PCM coding.
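The core of PCM is quantizing a continuous value into a fixed-width binary code word. A hedged Python sketch of the encode/decode step, assuming an 8-bit code word and a 5-volt full-scale range (both arbitrary choices for illustration):

```python
def pcm_encode(value, v_max=5.0, bits=8):
    """Quantize an analog voltage into a PCM code word (an integer of `bits` bits)."""
    levels = (1 << bits) - 1          # 255 discrete steps for 8 bits
    value = max(0.0, min(v_max, value))
    return round(value / v_max * levels)

def pcm_decode(code, v_max=5.0, bits=8):
    """Reconstruct the approximate voltage a PCM code word represents."""
    levels = (1 << bits) - 1
    return code / levels * v_max

code = pcm_encode(3.3)     # → 168
approx = pcm_decode(code)  # ≈ 3.294 V, close to the original 3.3 V
```

The receiver only has to distinguish two signal states per bit, which is why a digitally coded link is far more resistant to interference than a raw analog one; the cost is a small quantization error set by the bit depth.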
By the early 2000s, Spread Spectrum rc control systems came into use. A Spread Spectrum system uses a variable frequency of operation, usually in the 2.4 gigahertz band, in which the rc transmitter stays on a given frequency for only a minimal amount of time. With the enhanced security offered by Spread Spectrum systems, an increasing number of radio manufacturers are offering the units to hobbyists, at prices ranging from $3,000 down to as low as $30. A number of manufacturers are now selling conversion kits for older digital 72 MHz radios and receivers, providing even more options for the rc model operator.
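The idea behind frequency hopping can be sketched in a few lines of Python: transmitter and receiver share a seed, derive the same pseudo-random channel sequence, and hop together. The 80-channel count and the seed value here are illustrative assumptions, not any manufacturer's actual protocol:

```python
import random

def hop_sequence(seed, n_hops, channels=80):
    """Pseudo-random channel hop list for a 2.4 GHz-style band.

    A paired receiver seeded with the same value derives the identical
    sequence, so both ends stay in sync while an eavesdropper or a
    second transmitter on a different seed rarely lands on the same
    channel at the same time.
    """
    rng = random.Random(seed)
    return [rng.randrange(channels) for _ in range(n_hops)]

tx = hop_sequence(seed=0xC0FFEE, n_hops=5)
rx = hop_sequence(seed=0xC0FFEE, n_hops=5)
assert tx == rx  # both ends agree on the hop pattern
```

Because the transmitter dwells on each channel only briefly, narrowband interference on any one frequency corrupts at most a small fraction of the control frames.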
In 1939, Trans World Airlines was becoming a major competitor with Pan American Airways for the emerging overseas route service. While TWA contracted with Lockheed to develop an aircraft to rival the performance and capacity of the Boeing Stratoliner, a major stockholder of TWA asked Lockheed to build an even greater plane, one which would ultimately define both an airline and an era of aviation.
Though Lockheed had been working on the L-044 Excalibur since 1937, Howard Hughes, the majority stockholder of Trans World, requested that Lockheed develop an even more capable aircraft, with a forty-passenger capacity and a range of 3,500 miles. The new design, the L-049 Constellation, was a radical departure from previous airliners. The triple-tail configuration kept the aircraft's height low enough to fit in existing hangars, while the wing layout was similar to that of another Lockheed plane, the P-38 Lightning. The L-049 featured such innovations as hydraulically boosted controls, a de-icing system on wing and tail surfaces, and tricycle landing gear. The Constellation had impressive performance for its day, attaining a maximum speed of 375 mph and a cruising speed of 340 mph, faster than many fighters of the era, with a service ceiling of 24,000 ft.
While intended for use as an airliner, the L-049s entering service for TWA in January 1943 were quickly converted to military transports, with the USAAF ordering 202 aircraft. Designated C-69 by the military, the type was used primarily as a long-range troop transport. Though the C-69 was successful in its role, only 22 aircraft were produced during the war. A number remained in service with the USAF into the 1960s, ferrying relocating military personnel. Lockheed even had plans to develop the L-049 as a long-range bomber (the XB-30), but the design was never pursued.
Following World War II, the Constellation began its heyday. USAAF C-69 transports were completed as civil airliners, with TWA accepting its first aircraft in October 1945 and initiating its first transatlantic flight, from Washington DC to Paris, in December of that year. During the late 1940s, the Constellation was upgraded several times to increase fuel capacity and speed. Finally, in early 1951, the Super Constellation was introduced. The L-1049 Super Connie was stretched 18.4 ft. over the original L-049, expanding passenger capacity to ninety-two seats, with a cruising speed of 305 mph and a range of 5,150 miles. With auxiliary wing-tip fuel tanks, the Super Constellation could fly non-stop between New York and Los Angeles; some pilots used to shorter runs began to complain about long days. An early problem with the 1049 model was excessive exhaust flaming, sometimes reaching past the trailing edge of the wing. Once the exhaust problem was corrected, the Super Connie became a highly successful airliner.
In 1955, the Constellation underwent additional updates. Though still called the Super Constellation, the Model 1649 was first designated the Super Star Constellation, finally evolving into the Starliner name under Lockheed. The Starliner was the most extensive modification of any Constellation model, with features such as fully reclining seats for long flights, more precise cabin temperature and ventilation control, and state-of-the-art noise insulation. External improvements included a longer and narrower wing, nearly doubling the capacity of the original Connie, with twice the range at maximum payload, enabling it to reach any major European air hub non-stop from US airports. The Model 1649 also holds the distinction of being the fastest piston-engined airliner flown at ranges of over 4,000 miles.
The Constellation served in a number of military roles in addition to troop transport. In 1948, the USAF placed an order for ten Constellation transport aircraft (C-121), several of which were deployed in support of the Berlin Airlift later that year. Six of the planes were later reconfigured as VIP transports (VC-121), one of which was used by Dwight Eisenhower as NATO's top commander. Eisenhower was so impressed with the plane that he named it Columbine; when he became President, he was assigned another VC-121, which he named Columbine II. In the early 1950s, the US Navy, Air Force, and Marine Corps ordered C-121s fitted with radar domes to provide long-range radar coverage for surface ships, as well as surveillance radar for command and control of aircraft. In the early 1960s, EC-121s briefly performed an anti-submarine role for the US Navy.
By the end of the 1950s, the Constellation had become an aviation icon. It was in service with more than a dozen airlines and was the flagship of Trans World Airlines. The Connie also served the US military and several other government agencies, with duties ranging from tracking smugglers to tracking hurricanes. Though expensive to build due to its tapered fuselage, the Constellation was a graceful aircraft. While rapidly phased out by the major airlines in 1961 in favor of newer jetliners such as the Boeing 707 and Douglas DC-8, the Connie remained in use with a number of regional airlines. In all, 856 examples were built; Howard Hughes's gamble in 1939 had paid off in a big way.
Though flying an rc model can be a fun activity, certain safety considerations must be observed in order to make the flight both a safe and enjoyable experience. During this blog, we’ll take a look at buying an rc plane or helicopter from the safety standpoint, as well as techniques to promote safe flying.
Real aircraft must undergo a pre-flight checklist, which is also a good philosophy for radio control aircraft. The pilot must make sure the rudder, ailerons, and elevators are functioning properly, with both the receiver battery and radio fully charged. A particular concern with rc planes, as opposed to rc helicopters, is the center of gravity: the point at which the plane must balance in order to fly well. The center of gravity for a plane with a tail can be as far back as 32% from the nose, though the operator may still have to adjust the balance. While placement of the battery and radio can compensate for any CG imbalance, it's always desirable to keep both the nose and tail sections light, making adjustments at the center of the fuselage. A properly balanced plane will be more responsive to commands and use less fuel or battery charge. Flying wing designs typically balance at 23% back from the nose.
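The balance calculation behind battery and radio placement is just a mass-weighted average of where each component sits. A minimal Python sketch, using hypothetical component weights and positions rather than any real model's numbers:

```python
def center_of_gravity(components):
    """Balance point of the model: mass-weighted mean of component positions.

    components: list of (mass_g, position_mm) pairs, with positions
    measured back from the nose.
    """
    total_mass = sum(m for m, _ in components)
    moment = sum(m * x for m, x in components)
    return moment / total_mass

# Hypothetical 1000 mm model: airframe, battery, and receiver placements.
parts = [(800, 450), (150, 120), (50, 300)]   # (grams, mm from the nose)
print(center_of_gravity(parts))  # → 393.0 mm from the nose
```

If the computed balance point falls behind the target, shifting the battery forward (more mass at a smaller position) pulls the CG toward the nose without adding any weight.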
Before launching the plane, be sure the correct propellers are installed. The thickest section of the prop should face toward the front; the rc pilot can identify the front of the blade by the manufacturer's lettering. The plane will still fly with the prop(s) mounted backward, though at about a third of the power of a correctly mounted blade, since the thicker front section displaces more air. Better quality props are more rigid, and thus more stable in flight, especially at high rpm; they are also less likely to flatten out over extended use. Getting a good launch is more difficult than it appears. Inexperienced rc pilots have a tendency to spin the plane on release, a frequent cause of crashes: if one wingtip is moving faster than the other, it will have more airflow over its wing, and the plane will roll toward the slower wing. The correct procedure is to release the plane when both wings are level and moving in the same direction at the same speed. If the model is launched at too steep an angle, it will experience an immediate stall.
Transmitters are an important element of rc flight. Many rc pilots have a tendency to fly their planes with their thumbs alone; instead, grip the transmitter sticks from the side on the elevon (elevator-aileron) control. This gives the rc pilot more than one point of contact with the controls and prevents accidental maneuvers. Don't jerk the control sticks; use gradual motions to control the model. Proper antenna angle is another factor, since local interference can affect signal quality: fly the plane at a close distance, trying different antenna angles to determine the optimum signal. While more recent rc planes are equipped with a homing function, which returns the plane to the transmitter if the model experiences signal or line-of-sight interference, it's always best to fly your rc plane no farther than your field of view. A three-channel transmitter with throttle, rudder and elevator controls is usually best for a beginner.

Speaking of planes, the most important decision facing beginning rc pilots is the choice of aircraft. The hard fact is that the plane will experience a number of crashes until the pilot becomes more proficient. Foam is a relatively inexpensive material and easy to repair if the rc plane is damaged; while foam construction is not the most pleasing to the eye, it provides the rc beginner with a practical means of getting into the air. RC models may be purchased either ready to fly (RTF) or in kit form, which must be assembled. Building your own model has the advantages of learning the parts and operating systems of the plane, as well as a lower cost. RC model planes may be powered by either gasoline engines or lithium polymer (LiPo) batteries. The use of LiPo batteries has increased drastically in radio control over the last ten years; they offer near gasoline-engine performance while being more compact, with little or no maintenance.
The use of radio control aircraft, quadcopters and drones has increased exponentially over the last fifteen years. Near collisions between drones and passenger aircraft now run into the hundreds each year, with the FAA receiving in excess of 100 reports per month. While most drones weigh less than ten pounds and fly at limited altitudes, heavier and more capable machines are on the rise. Even a collision between a lightweight drone and a jetliner could result in millions of dollars in damage if the jetliner's engine or control surfaces were struck. Though the FAA has had a regulation in effect for four years making it illegal to fly a drone within five miles of an airport and limiting altitude to 400 ft., many operators who use drones in their business pay it scant attention. A year after that rule, the FAA enacted a five-dollar registration fee for all drones weighing more than half a pound; while ineffective at tracking drones, it may get the attention of some operators. For all the electronics and regulations, perhaps the best source of rc model safety is common sense.
Pretend you're the pilot of a large jetliner. You've completed pre-flight checks, both inside and outside, and are ready for takeoff. As you climb, the plane begins to vibrate and then pitch to one side. The number two engine then separates, and you are faced with a decision: jettison the remaining fuel or make a heavy landing with fuel on board. While engine separations are not frequent occurrences in air travel, they can have tragic consequences for both the plane and the surrounding area. During the course of this blog, we'll review two key cases involving such incidents.
In May 1979, a McDonnell Douglas DC-10 (Flight 191) was making a regularly scheduled passenger flight from O'Hare International Airport in Chicago to Los Angeles International Airport. Moments after takeoff, the aircraft plummeted to the ground, killing all 258 passengers and thirteen crew, along with two people on the ground. A subsequent investigation revealed that the number one engine had separated from the left wing, flipping over the top of the wing and landing on the runway. During the separation, the engine severed several hydraulic lines, locking the leading-edge wing slats in place and damaging a three-foot section of the wing. As the plane began to climb, it entered a state of uncontrolled aerodynamics, in which the left wing provided minimal lift compared to the right while the engines were at full throttle. This condition caused the aircraft to roll abruptly to the left, reaching a bank angle of 112 degrees before crashing.
While the cause of the DC-10 engine loss was later traced to a damaged pylon structure connecting the engine to the wing, several other factors also played a role in the crash. The number one hydraulic system, powered by the number one engine, failed, though it could still be driven by motor pumps connected to the number three system. While hydraulic system three was also damaged, it continued to provide pressure until the crash, in spite of leaking fluid. Electrical problems were also a factor: the number one electrical bus, powered by the number one engine, failed, taking several systems offline, including the captain's instruments, stick shaker and wing slat sensors. As a result of the partial electrical failure, the flight crew received a warning only of the number one engine's failure, not of its loss. Though the aircraft had a closed-circuit television screen behind the pilot for viewing the passenger compartments, it too lost power with the engine. After the Flight 191 incident and three other DC-10 crashes during the 1970s, a number of major airlines began to phase out the DC-10 in the early 1980s in favor of newer, more fuel-efficient jetliners such as the Boeing 757 and 767. While the phaseout was driven mainly by fuel efficiency, questions about the aircraft's safety cast a cloud over its service.
The DC-10 wasn't the only wide-body jet to experience engine separation. In October 1992, an El Al Boeing 747-200 cargo plane (Flight 1862), with three crew members and one passenger on board, began a flight from John F. Kennedy Airport, New York to Ben Gurion International Airport, Tel Aviv, with an intermediate stop at Schiphol Airport, Amsterdam. Weather conditions were favorable at the time of departure, and all pre-flight checks had been performed with no defects found. About ten minutes out of Schiphol, the flight data recorder indicated that both engines 3 and 4, along with their connecting struts, had left the aircraft. The co-pilot transmitted an emergency call to Schiphol, requesting a return to the airport. However, the aircraft could not make a straight-in approach, due to both its altitude and its proximity to the airport; the air traffic controller had to vector the El Al plane back by flying a pattern of descending circles to lower its altitude for a final approach. About five minutes into the pattern, the flight crew informed the controller of the loss of engines three and four and reported flap control problems. The controller directed a new heading to the flight crew but noticed the plane was taking 30 seconds to change headings. About three minutes later, the flight crew informed air traffic control that they were receiving audible warnings indicating a lack of control and low ground proximity. Approximately twenty-five seconds later, the aircraft crashed into an eleven-story apartment building about seven miles from Schiphol Airport.
Both the number 3 and number 4 engine struts were recovered from Naarden Harbour, just east of Amsterdam, with both engines still attached. The remaining parts of the aircraft were located within a thousand-foot radius of the impact. From an analysis of the parts and their placement, investigators determined that the number 3 engine separated first, traveling outboard and striking engine 4, causing it and its supporting strut to separate from the plane. The engine struts, or pylons, are designed as two-cell torque boxes transferring vertical, horizontal and torsional thrust loads to the wing, acting as an aerial shock absorber. The Boeing 747 pylon was supported internally by five fuse pins, which provide enough strength to hold the pylon in place except under extreme loads, at which point the pins fail, allowing the engine to break away without damaging the wing fuel tanks. Boeing adopted this philosophy from experience with the earlier 707 and 727 models, which suffered a number of engine separations both on the ground and in flight. The crash of the El Al jetliner was attributed to the failure of a center fuse pin in the number 3 engine strut: the pin, a hollow "bottle bore" design, had cracked due to metal fatigue. The FAA had issued a directive in 1979 requiring airlines to inspect the fuse pins every 2,500 flight hours, as the bottle design was prone to fail at that point, but the El Al 747 was one of a few aircraft which had not yet had its bottle-bore pins replaced. As a result of the El Al crash and two other 747 crashes, the FAA mandated a retrofit of all Boeing 747 wing struts in 1995. The new strut design offered increased protection in the event of an engine separation, while still using fuse pins to protect the wing tank from damage during a ground impact.
As the two previous cases indicate, engine separations may result from a number of causes. Sometimes it’s a matter of faulty parts; in other cases, a lack of proper maintenance plays a role. The overall design of the aircraft itself may also be a factor. Ultimately, the safe operation of an aircraft requires a continual interplay among aviators, air traffic controllers, engineers and the flying public to promote flight safety.
In late 1945, the USAAF was at a crossroads. While the B-29 Superfortress had been a capable platform for carrying the war to Japan, future requirements dictated an aircraft of intercontinental range, in excess of five thousand miles. The Convair B-36 Peacemaker met this requirement, but would not enter service for three more years. Further complicating matters, General Curtis LeMay and several other forward-thinking generals were already considering a jet-powered bomber. Within a few years, the generals and engineers got together and designed a truly great jet bomber – the Boeing B-52 Stratofortress. In this blog we will tell the story of the B-52, its development and its long service record with the USAF.
In addition to range, the other performance characteristics specified by the Air Materiel Command in 1946 were a cruising speed of 300 mph at an altitude of 34,000 ft., with a minimum payload of 10,000 lbs. and five or six 20 mm gun turrets. The AMC issued bids later that year, with Boeing, Glenn L. Martin and Consolidated Aircraft submitting proposals. The Air Force accepted the Boeing proposal, an aircraft powered by six turboprop engines with a range of 3,110 miles. The Boeing plane, designated Model 462, was a straight-winged aircraft with a gross weight of 360,000 pounds – a heavy plane for its day. Because of the weight issue, the Air Force began to have doubts about the aircraft’s ability to perform its mission. Boeing then offered a smaller follow-up design, Model 464, with four engines and a 230,000-pound gross weight. While the 464 was deemed acceptable, the Air Force changed its requirements within a few months to a plane with a 400 mph cruising speed and a 300,000-pound gross weight. Additionally, the Air Force wanted an aircraft with a range of twelve thousand miles, capable of delivering a nuclear weapon. These modifications increased the gross weight of the plane to 480,000 lbs.
Boeing responded by proposing two bombers, Model 464-16 and Model 464-17. Both were four-engine turboprop designs, with the Model 16 a nuclear-only aircraft carrying a ten-thousand-lb. payload. The Model 17 was a conventional bomber, able to mount a 9,000 lb. payload. By mid-1947 the Model 17 was deemed acceptable by the Air Force, except for the range requirement. By now designated the XB-52, the aircraft offered only marginal performance gains in speed and range over the Convair B-36, which was about to enter service. The Air Force then postponed the project for six months in order to evaluate its potential. After a series of intense discussions between Boeing and the Air Force, the XB-52 project was back on track in January 1948, with Boeing urged to include the latest aviation innovations, such as jet engines and aerial refueling, in the bomber design. In May 1948, Boeing studied substituting jet engines for the turboprops; the Air Force, however, still favored a turboprop design, since jet engines of the era lacked fuel efficiency. October 1948 proved to be a crucial month for the XB-52 project. Boeing engineers George Schairer, Art Carlsen and Vaughn Blumenthal presented a refined turboprop design to Colonel Pete Warden, Director of Bomber Development for the USAF. After reviewing the proposal, Warden asked the Boeing design team if they could prepare a proposal for a four-engine turbojet bomber. The following day, Colonel Warden scanned the new design and requested an improved version. After returning to their hotel room, Schairer, Carlsen and Blumenthal were joined by Ed Wells, Boeing Vice President of Engineering, and two other Boeing engineers, Bob Withington and Maynard Pennell. After eight hours of intense deliberation, the Boeing team had designed an entirely new airplane.
The new XB-52 concept had 35-degree swept wings, based on the B-47 Stratojet, with eight engines paired in four pods below the wings, bicycle landing gear and outrigger wheels beneath the wingtips. The XB-52 also had flexible landing gear, which could pivot up to 20 degrees from the aircraft centerline to compensate for crosswinds upon landing. Warden approved the design the following week, and the Air Force signed a contract with Boeing in February 1951 for an initial production run of 13 B-52As.
When the B-52 entered service in 1955, it was assigned to the Strategic Air Command (SAC) to deliver nuclear weapons under the doctrine of massive retaliation. Carrying a 50,000 lb. payload and capable of flying nearly halfway around the globe, the Stratofortress was ideally suited to its role and soon became the standard for future bomber aircraft. Three B-52s from March AFB set a round-the-world flight record in 1957. As with all aircraft, however, it had its share of teething troubles. For example, the split-level cockpit had climate control problems: while the pilot and co-pilot baked in sunlight on the upper deck, the navigator and observer nearly froze on the lower one. Early B-52 models were often grounded by electrical and hydraulic issues, and the Air Force assigned contractor teams to B-52 bases to troubleshoot problems as they arose.
By the late 1950s, advances in Soviet surface-to-air missile (SAM) capabilities brought about a major upgrade in the electronic countermeasure capabilities of the B-52. This threat also caused SAC to change its philosophy from high-altitude bombing to low-level penetration. The switch to low-altitude bombing required a number of modifications to B-52 components. Features such as an updated radar altimeter, structural reinforcements, modified equipment mounts, an enhanced cooling system and terrain-avoidance radar were necessary to support missions flown at altitudes as low as 500 ft. By the end of the decade, B-52 capabilities grew with the addition of the Quail and Hound Dog missile systems. The Quail, a decoy missile, was carried in the aft bomb bay of the B-52 and launched while in flight to the target. The missile was programmed by the crew to match the speed and altitude of the B-52, thus confusing Soviet radar. Each Stratofortress carried four of these, in addition to its regular nuclear payload. North American’s entry, the AGM-28 Hound Dog, was an offensive missile launched from the B-52 to carry a nuclear warhead to its target. With a Mach 2 speed and an operating altitude ranging from 500 to 60,000 ft., the Hound Dog was able to penetrate enemy air defenses to a range of 600 miles. The primary drawback of the Hound Dog was its weight: at 20,000 lbs. each, the B-52 could carry only two of them, with a corresponding fifteen percent loss of range.
The 1960s saw a change of doctrine for SAC. With the emergence of land-based intercontinental ballistic missiles (ICBMs), as well as submarine-launched ballistic missiles (SLBMs), the manned bomber force became one leg of a nuclear triad. The primary advantage of the missile legs was their relative invulnerability to enemy attack. They were also cheaper to operate than a manned bomber fleet. Both ICBMs and SLBMs offered a quick response to an enemy attack, while a response from manned bombers took considerably longer. The growing threat from Soviet ICBMs was another factor countering the effectiveness of the manned bomber leg. Given the potential for conflict in Berlin, Cuba and a number of third-world countries, the Kennedy Administration decided to scrap the policy of massive retaliation, replacing it with the doctrine of flexible response. Instead of maintaining a large nuclear umbrella with small conventional forces, conventional forces were increased in order to keep any potential war from escalating past the nuclear threshold. Under the flexible response doctrine, nuclear weapons were to be used in a limited role against selected targets. Thus the B-52 had a new mission: to loiter on patrol at the edge of Soviet airspace, ready to strike designated targets in a retaliatory role. The Stratofortress was the ideal plane for the job, having the range, speed and payload, as well as an aerial refueling capability.
While the B-52 was designed as a nuclear weapon delivery system, it served an entirely different purpose in Viet Nam. In 1964, seventy-four B-52s were modified with external bomb racks, which could carry an additional twenty-four 750 lb. bombs. The following year Operation Rolling Thunder began, in which the USAF commenced bombing missions in both North and South Viet Nam, with the primary role of the Stratofortress being to support ground operations in the South. The first B-52 mission, Operation Arc Light, was flown in June 1965 against a suspected Viet Cong stronghold in the Ben Cat District of South Viet Nam. Twenty-seven B-52s participated in the raid, bombing a one-mile by two-mile box. Though only partially successful, the raid proved the potential of the B-52 as a ground attack weapon. Later that year, a number of B-52s underwent modifications to increase their capacity for carpet bombing. These raids were devastating to anyone in or near the target areas. B-52s bombed North Viet Nam in late 1972 during Operation Linebacker II. These missions helped bring about the peace talks that ended the war, although at a loss of 15 Stratofortresses. During that campaign, B-52 gunners claimed two North Vietnamese MiG-21s – the first hostile aircraft shot down by the plane.
The Stratofortress went on to provide ground support in Operation Desert Storm in 1991, Operation Allied Force in Serbia in 1999, Operation Enduring Freedom in Afghanistan in 2001 and Operation Iraqi Freedom in 2003. The B-52 has proven itself a durable and adaptable plane, receiving numerous modifications over its 63-year career. It has dropped bombs, launched missiles and served as an experimental platform, even launching the X-15 rocket plane. Current efforts by Boeing to re-engine the Stratofortress are projected to extend its service life through 2040. One could say of the B-52: it’s the plane that keeps on flying.
During the last five years, both the capabilities of drones and the uses for them have increased exponentially. In this blog, we’ll trace the employment of drones in a number of industries.
While much of the underlying drone technology isn’t new, recent investments in both capital and technology have made drones a practical tool in a number of industries. The agricultural sector is one in which drone applications are on the rise. With the global population projected to reach about 9 billion by 2050 and agricultural consumption to increase by 70 percent over the same period, drones have the potential to revolutionize that sector of the economy. Agricultural drones are high-tech systems that perform many tasks a farmer can’t, such as conducting soil scans, monitoring crop health, applying fertilizers and water, tracking weather, estimating yields, and collecting and analyzing data. With the FAA currently streamlining regulations for agri-drone use, agriculture could account for approximately 80% of all drones produced, according to a recent study by Bank of America Merrill Lynch.
A number of construction companies are exploring the possibilities of utilizing drones, or UAVs (unmanned aerial vehicles), in that industry. Drones have a number of roles in construction, among them marketing, surveying, inspection, progress reporting, safety and monitoring workers at multiple sites. In the survey role, drones allow contractors to get detailed information about a job site, as well as conditions on surrounding properties. While site surveyors are necessary in some situations, drones can perform essentially the same function at a fraction of the cost. In the realm of construction inspection, drones offer a high degree of flexibility. For example, drones can effectively scan the roof of a skyscraper, revealing any possible construction faults. They are also useful at sites such as tunnels and bridges, which may be inaccessible from the surrounding land. The contractor can even use the drone to compare the construction against the actual plans of a project. Drone photography can show aerial views of a site from different angles to determine the feasibility of construction, and these photos can be sent to potential contractors during the bid process. The same capability is also useful for showing job progress to developers, who may not be able to visit the site on a regular basis. Finally, drones provide a means of monitoring the safety of workers at multiple sites, keeping the contractor informed of any safety issues in real time, with a fraction of the manpower and cost of on-site supervisors.
Drones also have potential in the commercial sector. For example, Wal-Mart is currently utilizing drones comparable to those used in agriculture to scan warehouse inventory, checking for missing or misplaced items. Drones flying through a warehouse are able to complete an inventory in a day – a task that would take an on-site warehouse crew a month. Though delivery by drone is in its early stages, a few major companies are experimenting with it. Domino’s Pizza began a delivery service in Britain, in which a drone was able to deliver two pizzas per trip – a service with the obvious advantage of avoiding traffic jams. In Philadelphia, a dry cleaning service is using drones to make emergency deliveries of laundry to customers. Though weight restrictions are a problem, the drones are capable of flying a freshly cleaned suit to a customer’s front door. The latest evolution is party drones, which fly over an outdoor party playing prerecorded music.
While drones haven’t been adopted on a mass scale, they have increased the functionality of a number of key industries, breaking through traditional barriers. From quick deliveries to monitoring construction progress to agriculture, drones increase work efficiency and productivity, improving customer service, safety and security – with little or no manpower. According to a recent PricewaterhouseCoopers study, drone-related activity provides an economic boost of more than $127 billion globally. With the relaxed FAA flight rules approved in 2016, drone operators have more operational flexibility. As it becomes cheaper to develop industry-specific drones, subsidiary niche markets will emerge. A recent study indicates the use of commercial drones could add $82 billion and 100,000 jobs to the national economy by 2025 – not bad for a young industry.
While the United States was a pioneer in aviation development during much of the twentieth century, many of its airports border on a state of decay. During the course of this blog, we’ll examine the current state of the nation’s airports, as well as a number of proposed solutions.
Though many complain about airports, often as a result of troubled airline experiences, perhaps comparing major air hubs in the United States to their more modern overseas counterparts is unrealistic. Each airport has its own unique history in relation to the community it serves. Aviation development in the US increased dramatically after World War II, with airport construction complementing that effort. Many of the prime airports in the United States were conceived in an era before the proliferation of both foreign and domestic air routes. Most airport renovation efforts over the last 30 years have involved a limited patchwork process, since many of the hubs are surrounded by urban areas – unlike the modern air hubs of Asia and the Middle East, which serve emerging markets and can afford to emphasize architecture and aesthetics while serving large volumes of passengers. For example, Dubai’s main airport covers an area of about 7 million square feet, designed to serve 25-30 million passengers per year, while the JetBlue terminal at JFK airport serves approximately 22 million passengers per year in an area of less than 1 million square feet. Post-9/11 security and related requirements have placed additional stress on US airports. The financial and environmental costs of airport construction often make such proposals a political liability. The ownership and control of airports in the United States – a landlord-tenant model between the airlines and the municipalities – also serves to inhibit progress.
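The crowding gap described above can be made concrete with a quick back-of-envelope calculation. This is purely illustrative, using the approximate passenger and square-footage figures quoted in the paragraph:

```python
# Rough passenger-density comparison (annual passengers per square foot of
# terminal area), using the approximate figures cited above.

def pax_per_sqft(annual_passengers: int, terminal_sqft: int) -> float:
    """Annual passengers handled per square foot of terminal space."""
    return annual_passengers / terminal_sqft

# Dubai: ~7M sq ft serving 25-30M passengers/year (27.5M midpoint assumed).
dubai = pax_per_sqft(27_500_000, 7_000_000)

# JetBlue terminal at JFK: ~22M passengers/year in under 1M sq ft.
jfk_jetblue = pax_per_sqft(22_000_000, 1_000_000)

print(round(dubai, 1))        # ~3.9 passengers per sq ft per year
print(round(jfk_jetblue, 1))  # ~22.0 passengers per sq ft per year
```

By this crude measure, the New York terminal handles roughly five to six times the passenger load per square foot of its Dubai counterpart, which helps explain why US hubs feel so much more congested.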
Given the constraints on space in many urban areas, airport designers are forced to move up rather than out. In a practical sense, any airport restructuring begins with the check-in process. By placing security and check-in on separate levels, traffic flow is segregated between the two functions. Such an organization divides passengers into two categories – those who are able to check in with the aid of mobile devices, and those who use the more traditional (paper) approach and may require assistance to board their flight. Both groups must pass through security before boarding. Such an arrangement could cut pre-flight processing time by as much as 40 percent. As mobile technology becomes more dominant, it offers air carriers both the convenience and the flexibility to book flights outside the confines of an airport. Satellite check-in sites at hotels, restaurants and shopping centers allow airlines the option of verifying and staging passengers from remote locations, requiring less staff and processing time. Delta Air Lines, for example, has set up its own security service at a major airport from which to process passengers. This concept provides both security and marketing benefits.
A recent trend in airport check-in procedures is the use of self-service technology. Miami International Airport purchased approximately 45 automated kiosks to reduce customs and immigration processing time. These kiosks can process a passenger within two minutes, making what was once a grueling check-in process a relatively seamless one. Several major air hubs are also embarking on airside improvements to enhance the passenger experience. For example, Chicago O’Hare began a $15 billion capital investment program in 2005, transforming its system of intersecting runways into a series of parallel ones, which will increase capacity by 60% while substantially reducing delays. An additional control tower, runway and cargo center are under construction at O’Hare and are slated to be operational in about three years. Los Angeles International began an $8.5 billion expansion program in 2006, with construction of the new Tom Bradley International Terminal completed in 2013, featuring new dining, gate and retail areas designed to meet the needs of international tourists. Related projects include updating Terminal 6 to accommodate large-scale aircraft such as the Airbus A380. LAX is also building a new Central Utility Plant, along with taxiway and runway improvements.
Understanding how the above innovations affect terminal operations will be the key to the future success of the nation’s airports. As air traffic continues to grow despite economic and other setbacks, passengers will continue to demand more control over their travel experience. Airport planners must continue to emphasize key passenger services such as transit, parking and baggage claim to remain competitive, while focusing on the core mission of airports as gateways to the world.
After my grandson flew his newly purchased quadcopter a few weeks ago, I was stunned by the quality of the video produced by its camera. During the course of this blog, we will trace the development of compact cameras, as well as their effect on radio-control models.
The evolution of drone cameras began in 1901, when renowned photographer George Lawrence conceived the idea of attaching a camera to a balloon to take photos of banquet halls and outdoor ceremonies. Lawrence developed a panoramic camera with a relatively slow shutter speed, which proved ideal for area photographs. While his first balloon pictures were a success, both he and the balloon crashed, with Lawrence surviving a 200 ft. fall without injury. He then developed a camera platform using a series of kites connected by bamboo shafts to support the weight of the camera, and ran a steel piano wire from the ground up to carry the electrical current that would trip the camera shutter. The photos were retrieved by parachute. This system was so successful that Lawrence used it to photograph San Francisco after the 1906 earthquake – from which he earned $15,000.
However, it wasn’t until the advent of digital technology in the 1970s, which allowed photography to become more adaptable, that compact cameras became feasible. A digital camera is a hardware device that takes pictures like a conventional camera but stores the image as digital data instead of recording it on film. Most digital cameras are now capable of recording video in addition to taking photos. Perhaps the earliest precursor to digital photography occurred in 1957, when Russell Kirsch, a pioneer of computer technology, developed an image-scanning program utilizing a rotating drum to create images – the first scanned image being a picture of Kirsch’s son. In 1969, the charge-coupled device was created at AT&T Bell Labs: a semiconductor capable of gathering data from photoelectric sensors, then transferring that charge to a storage capacitor. Analog data could be transferred from a light-sensitive chip and converted into a digital grid, producing an image. In 1974, Bell Laboratories developed a charge-transfer system that could store and transfer charge packets containing pixel data in serial order. Bell further refined this system in 1978, producing a charge-transfer imaging device built on solid-state technologies. This system was more cost-effective and prevented the smearing aberrations created by similar image-capture devices.
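The serial readout idea behind those Bell Labs charge-transfer devices can be sketched in a few lines of code. This is a toy illustration, not a model of any real sensor: each pixel’s accumulated charge is shifted, bucket-brigade style, toward a single readout node, one packet per clock cycle, and digitized in order.

```python
# Toy simulation of serial charge transfer, the principle behind early CCDs:
# charge packets march toward one readout node and are digitized one by one.

def read_out_line(charges, levels=256, full_well=1.0):
    """Shift a row of analog charge packets (values in [0, full_well])
    serially to the readout node and digitize each one."""
    digital = []
    wells = list(charges)
    while wells:
        packet = wells.pop(0)  # the front packet reaches the readout node;
                               # the remaining packets all shift one cell over
        code = min(levels - 1, int(packet / full_well * levels))
        digital.append(code)
    return digital

# A 6-pixel line with varying exposure, read out in serial order:
row = [0.0, 0.25, 0.5, 0.75, 1.0, 0.1]
print(read_out_line(row))  # → [0, 64, 128, 192, 255, 25]
```

The key property this captures is that the sensor needs only one output amplifier and converter: spatial position is encoded purely in the *order* in which packets arrive, which is what made early CCD imagers so simple and cheap to build.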
In 1973, Eastman Kodak took a gamble and hired Steve Sasson, a young electrical engineer. Sasson was one of a small cadre of electrical engineers employed by Kodak, a company well known for its chemical and mechanical engineering projects. Sasson was directed to capitalize on the capabilities of a charge-coupled device created by Fairchild Semiconductor, which could transmit and store images of 100 by 100 pixels. In 1975, Sasson completed a prototype camera incorporating the charge-coupled device, adapting a lens from an eight-millimeter film camera, an analog-to-digital converter from a Motorola digital voltmeter, and a digital cassette recorder for storing image data. With this combination, Sasson and other Kodak technicians could capture an image and record it to a cassette in a mere 23 seconds.
By 1990, several companies had begun to enter the digital photography market, creating a new segment for consumer cameras. The first digital camera ready for sale in the US market was the Dycam Model 1, which came out the same year. The Model 1 was capable of recording images at a maximum resolution of 376 by 240 pixels. Two developments in the 1990s further enhanced the marketability of digital cameras. The first was a codec used for image compression, the precursor to the JPEG image file format of today. This system exponentially increased the storage capacity of digital cameras over the prior magnetic tape and floppy disc storage systems. By the mid-1990s, Apple began to market the QuickTake 100, then the most widely marketed digital camera in the United States. The QuickTake had a maximum resolution of 640 by 480 pixels and could store up to 24 images in 24-bit color. In 1995, Casio released the QV-10, the first consumer digital camera to include a liquid crystal display (LCD) screen, which allowed camera owners to quickly review newly photographed images. Other developments in the 1990s included a pocketable imaging device with an LCD screen capable of displaying images from a camera storage device, as well as a single-lens digital reflex camera, which could reproduce camera images in 35mm film quality. By the end of the decade, digital cameras had reached resolutions of 2,000 by 2,000 pixels.
In the early 2000s, a merging of digital camera and lithium-polymer battery technologies took place. The flexible polymer battery began to deliver near gas-engine performance while adding less weight and volume to the RC model frame. Digital cameras were now both lightweight and efficient, capable of both still photos and video covering a relatively wide area. By 2010, a number of drones and smaller quadcopters carried flash drive units, which could be inserted into the RC model’s camera to record flight video. Once on the ground, the RC pilot would insert the drive unit into the USB port of a personal computer and play back the video of the quadcopter flight on the monitor – a far cry from pulling piano wire to trip a camera shutter.