Part of that dream included the comforts of home ownership and the opportunity to start a family. Although the housing boom began shortly after World War II, it wasn’t until 16 million veterans actually returned from the war that America went on a full-fledged housing spree. The baby boom was well on its way. By 1954, more than four million babies were being born each year, peaking at 4.3 million in 1957.
To keep up with residential housing demand, especially in the Southwest, large builders turned to mass-produced, “cookie cutter” tract housing, making home ownership affordable to millions.
The assembly-line methods employed by subdivision builders offered “modern” homes with panelized construction, drywall instead of wet plaster, and trusses, so more homes could be built rapidly. The returning GIs were interested in low-cost housing, low down payments, and long-term mortgages. Tract housing filled that need.
According to Domestic Engineering (Oct. 1981), “Three out of five families became homeowners and suburban living became a national phenomenon.”
The hvacr industry now had the opportunity to experiment with a variety of low-cost heating methods, which included designing and implementing heating systems for the trendy new split-level homes, homes built on crawl spaces, and homes built on slab foundations. It was the first time in history that homes had been built on just a slab of concrete, according to Seichi Konzo’s book The Quiet Indoor Revolution (University of Illinois, 1992).
Contractors faced the daunting task of adapting heating systems to the new structural designs, each with its own heat loss characteristics. After all, furnaces didn’t necessarily have a basement to go into, and the houses were going up too fast to wait for ductwork to be assembled. The hvacr industry had to adapt to the structures “on the fly.” According to The Quiet Indoor Revolution, 1950s homes had the following heating characteristics:
Panel heating was popularized for residential applications in the 1950s by renowned American architect Frank Lloyd Wright. Panel heating was designed to warm cold slab floors, radiating heat upwards, where surfaces, objects, and people would then absorb the heat.
Wright used hydronic radiant floor heating at Fallingwater in western Pennsylvania. Wright’s challenge was to find a way to reliably heat the home throughout the winter, despite the fact that it was built over a waterfall. He chose radiant floor heating powered by a small, oil-fired boiler, a system that has been heating Fallingwater for more than 50 years.
The hydronics and hvacr industry tried to adapt floor panel heating for houses built on a slab, but ran into two major problems: performing maintenance on broken copper pipes or ducts embedded in concrete and compacted soil, and controlling heat loss from the slab.
Since concrete is a good heat conductor, there was high heat loss associated with homes built on slabs that were heated with radiant floor panels. Another shortcoming was that during spring and fall, the slab was too slow to respond to the change in temperatures.
Homeowners liked radiant heating because it was comfortable and quiet, but the system couldn’t compensate for colder weather because the floor couldn’t exceed 80°F; above that, a homeowner would experience “hot foot” conditions. Carpeting often reduced heat transfer even further.
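The 80°F floor cap translated directly into a hard ceiling on heating capacity. As a rough sketch (the figures are illustrative, not from this article), using the common rule-of-thumb combined convective-plus-radiant coefficient of about 2 Btu/hr per square foot per degree of floor-to-room temperature difference:

```python
# Rough illustration of why the ~80°F floor limit capped radiant slab
# output. Assumes the common rule-of-thumb combined (convective + radiant)
# coefficient of about 2 Btu/hr·ft²·°F for heated floors.

COEFF = 2.0  # Btu/hr per ft² per °F of floor-to-room temperature difference

def slab_output(floor_temp_f, room_temp_f, area_ft2):
    """Approximate heat delivered by a warm floor, in Btu/hr."""
    return COEFF * (floor_temp_f - room_temp_f) * area_ft2

# A 1,000 ft² slab held at the 80°F comfort limit in a 70°F room:
print(slab_output(80, 70, 1000))  # 20000.0 Btu/hr, the most the floor can give
```

On a design day that called for more than that, the floor simply could not be pushed hotter without causing the “hot foot” complaints described above.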
Service could also be problematic. If there was a water leak, the subfloor piping would all have to be torn out and replaced with baseboard piping above the slab floor. In addition, high material costs and the increased demand for air conditioning made the systems less practical.
Eventually, the industry returned to installing baseboard radiator equipment in new homes built on slabs. The resurgence of radiant heating would not come to the forefront again until the 1970s, with the introduction of nonmetallic tubing.
Some designs placed the furnace, water heater, and washer and dryer in a separate utility room outside of the immediate living space.
If the furnace room was in an attached garage, installers had a hard time getting enough chimney height and providing adequate insulation for ducts running to the house. It was difficult to prevent heat loss from the furnace, pipes, and vent.
If the furnace was inside the house, there were more options available to the installer. Figuring out the best heating solution was a challenge. According to The Quiet Indoor Revolution, “The furnace could be upflow with warm-air ducts at the ceiling or attic space, downflow with warm-air ducts below the floor, or horizontal flow with the furnace placed in the crawlspace.”
Ranch houses built over a crawl space sometimes had two or three wall heaters, or subfloor heaters fired by natural gas or propane. Some wall heaters had oil burners vented to metal chimneys.
The wall heaters were unpopular with many central forced-air heating contractors, according to Konzo. They considered wall heaters to be a step backwards rather than an advancement in heating technology because they thought the heaters were dangerous, and didn’t offer the same well-rounded comfort that the new central heating systems did.
There was also a potential fire risk because the flames in the heaters weren’t always enclosed, according to The Quiet Indoor Revolution. The hvacr industry worried the media would hype this heating application as a godsend for heating the masses, undermining the industry’s recent heating advancements with what they considered an inferior device.
Those fears were short-lived, however, after deficiencies in the units were identified, and many wall heaters were replaced and upgraded.
Alternative layouts and shallower ducts were implemented, but sooner or later occupants would bump their heads somewhere in the basement, because headroom under the ducts fell below 6 ft 2 in. In split-level homes, duct design often required two ducts for a space that normally needed only one, creating a headache for installers.
Installers also used registers at the floor or baseboard level in split-level homes to ensure more even temperature gradients. The design was popular because the lower rooms stayed cool while warm air could move up to the upper rooms.
In those compact houses, builders envisioned a compact furnace, concealed in its own little closet, outside the confines of the basement if possible. Then the basement could be used for leisure activities.
In the 1950s and into the 60s, 98% of the population had electric power, with the growth in electricity use reaching a peak of 8% per year, according to the American Institute of Physics.
The process for making electricity, however, wasn’t nearly as efficient as it is today. Homeowners were encouraged to use more electricity to keep electric rates low. Researchers in the 1950s viewed electricity as plentiful, requiring minimal maintenance, with none of the ventilation requirements that come with burning a fuel.
As part of the campaign to keep electricity demands up, in 1956, the electricity industry launched the “Live Better Electrically” (LBE) campaign, which was supported by 180 electrical manufacturers and 300 electric utilities, according to a Los Angeles Times Aug. 13, 2001 article, “The All-Consuming Bills of an All-Electric Home,” by Diane Wedner. The campaign even had Ronald Reagan, host of “General Electric Theater,” do a TV tour of his wife Nancy’s all-electric home in the Pacific Palisades to promote the use of electricity, The Times revealed.
According to The Times article, an in-house G.E. sales pitch declared that “By Thanksgiving, there should not be a man, woman, or child in America who doesn’t know that you can ‘Live Better Electrically’ with General Electric appliances and a television.”
Critical of the widespread acceptance of electric heating without statistical support, a letter written to The News on Oct. 12, 1959, from G. Gatchell, Gatchell & Burwell Consulting Engineers, stated the following: “We have read page after page of propaganda, listing the advantages and benefits of electric heating, but no specific information on actual first cost and operating costs of systems which have been installed. If you desire, we would be happy to send you unbiased, factual reports that we have made to church and school boards listing comparison costs of various heating systems for their proposed buildings. None of them have seen fit to have electric heating systems installed after reading our reports, despite pressure extended by Utility Companies and Sales Engineers.”
One of the biggest draws in persuading Americans to convert to electricity as their primary energy source was the new heat pump, which was featured prominently in the “All-Electric Home” campaign.
For cooling, heat pumps use the vapor-compression refrigeration cycle; for heating, they essentially run the cycle in reverse, extracting heat from the outdoor air and releasing it indoors.
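To see why early heat pumps struggled in cold weather, the ideal (Carnot) coefficient of performance is a useful yardstick. The sketch below is illustrative only, with temperatures chosen arbitrarily; real 1950s machines performed far below these ideal limits:

```python
# Illustrative sketch (assumptions mine, not the article's): the ideal
# Carnot COP bounds how much heat a heat pump can move per unit of work,
# and it falls as the outdoor temperature drops.

def f_to_k(t_f):
    """Convert degrees Fahrenheit to kelvin."""
    return (t_f - 32) * 5 / 9 + 273.15

def carnot_cop_heating(t_indoor_k, t_outdoor_k):
    """Ideal heating COP: heat delivered indoors per unit of work input."""
    return t_indoor_k / (t_indoor_k - t_outdoor_k)

# Mild winter day: 70°F indoors, 45°F outdoors
print(round(carnot_cop_heating(f_to_k(70), f_to_k(45)), 1))  # 21.2

# Cold day: 70°F indoors, 10°F outdoors -- the ideal COP drops sharply,
# and it drops exactly when the most heat is needed.
print(round(carnot_cop_heating(f_to_k(70), f_to_k(10)), 1))  # 8.8
```

Actual units achieve only a fraction of the Carnot figure, so the cold-weather falloff that the ideal numbers hint at was far more severe in practice.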
A promotional brochure published by Kansas City Power and Light Company in 1953-54 called the heat pump “probably the most sensational apparatus — just now in the field-testing stage. …This new ‘heat pump’ already shows promise of being one of the most significant developments of our time.” Like any new development, the heat pump had some bugs to be worked out. Some early units were considered unreliable and inefficient.
Oil heating units were equally successful in the market during the 1950s. In the Nov. 9, 1959 issue, The News reported that “Popularity of oil heat as indicated by sales of central heating units, continues to increase, according to the Oil Heat Institute of America’s Market Research Dept., which reports factory shipments in August of some central heating units are up to as high as 40% over August of 1958.”
The Midwest favored natural gas; it was easy for the gas fields in the Gulf States to connect to the Midwestern states through large, 30- to 60-in.-dia pipelines. The first large natural gas markets were the industrial northern Illinois and southern Michigan regions.
Making the transition to gas was expensive for buildings that were once part of a heating district because all of the buildings had to be converted. Feasibility studies were conducted to determine where everything would be installed in a building’s basement; the potential area dedicated for the chimney; where gas piping might be situated; and how condensate piping could be modified to a forced hot water system.
After the feasibility study was completed, the actual design and installation of the system was left to engineers and heating installers. Natural gas was here to stay.
Many East Coast homeowners switched from burning coal to oil during World War II, having grown accustomed to the oil that was piped to East Coast cities throughout the war. It was not until the 1950s, however, that oil gained widespread acceptance throughout the East Coast.
The main oil heating manufacturers were located in the Midwest. Williams Oil-O-Matic, producer of low-pressure oil burners, was based in central Illinois, and the Timken high-pressure oil burner was produced in southern Michigan.
Oil-burning equipment 50 years ago, however, was not as refined as it is today. Users were confronted with poor uniformity in the oil delivered to them. Lower-quality oil filters clogged easily, and the oil formed carbon deposits at fuel nozzles and spark gaps.
Inexpensive, prefabricated chimneys composed of metal or other incombustible materials were designed especially for gas and oil furnaces. The chimneys were designed to resist corrosive flue gases and condensate, and could be easily assembled in the field.
However, some chimneys had flue temperature limits, and if they were overheated, they rapidly deteriorated.
It was later found that natural gas, with its high hydrogen content, produces large amounts of water vapor when burned, causing heavy condensation in the flue. This, in turn, eventually caused chimneys to crumble. As a remedy, metal liners were installed in the chimneys.
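The chemistry behind the condensation problem is straightforward: burning methane (CH4 + 2 O2 → CO2 + 2 H2O) yields two water molecules for every molecule of fuel. A back-of-envelope estimate (my own figures, using the standard molar volume of a gas and a nominal 97 cubic feet of gas per therm, not numbers from this article):

```python
# Back-of-envelope combustion chemistry: CH4 + 2 O2 -> CO2 + 2 H2O.
# Each methane molecule yields two water molecules, which is why gas
# flues see so much condensate.

MOLAR_VOLUME_L = 22.4    # liters per mole of an ideal gas at standard conditions
H2O_MOLAR_MASS_G = 18.0  # grams per mole of water
LITERS_PER_FT3 = 28.317  # liters in one cubic foot

def water_from_gas_kg(gas_ft3):
    """Approximate kilograms of water vapor from burning gas_ft3 of methane."""
    moles_ch4 = gas_ft3 * LITERS_PER_FT3 / MOLAR_VOLUME_L
    return 2 * moles_ch4 * H2O_MOLAR_MASS_G / 1000

# One therm of heat is roughly 97 ft³ of natural gas:
print(round(water_from_gas_kg(97), 1))  # ~4.4 kg, more than a gallon of water
```

Several kilograms of water vapor per therm of gas burned is more than enough, over a heating season, to soak and break down an unlined masonry chimney.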
Building on the transistor technology of the 1950s, Bell Laboratories in the United States created the first practical photovoltaic (PV) cell in 1954 with the invention of the silicon solar cell.
Researchers discovered that in chemically altered (doped) silicon, photons from the sun’s rays energize electrons, causing them to flow toward metallic conductors and produce an electric current, the working principle of the solar cell.
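The electrical output of such a cell scales with its area, its efficiency, and the sunlight striking it. As an illustrative sketch (the efficiency figures are typical published values, not from this article: Bell’s 1954 silicon cell converted roughly 6% of incident sunlight, versus more than 20% for modern panels):

```python
# Illustrative only (figures are mine, not the article's): PV output
# scales with sunlight, cell area, and conversion efficiency.

PEAK_SUN_W_PER_M2 = 1000  # typical full-sun irradiance at the earth's surface

def pv_output_watts(area_m2, efficiency, irradiance=PEAK_SUN_W_PER_M2):
    """Electrical power produced by a solar cell of the given area."""
    return area_m2 * efficiency * irradiance

# One square meter of 1954-era (~6%) cells vs. a modern ~20% panel in full sun:
print(round(pv_output_watts(1.0, 0.06)))  # 60 W
print(round(pv_output_watts(1.0, 0.20)))  # 200 W
```

At a few tens of watts per square meter, early cells were practical only where no other power source could reach, which is exactly why satellites became their first major application.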
By the end of the 1950s, PV cells were being used primarily in the space industry to power U.S. satellites. They remain a main energy source for manned and unmanned spacecraft today.
The earliest solar energy collectors were fixed devices that faced south and didn’t track the sun the way sophisticated collectors do today. Although solar energy wasn’t widely used in the 1950s, these early developments helped pave the way for the alternative energy exploration that increased during the energy crisis of the 1970s.
The benefits of solar energy are that the sun’s supply is unlimited, there is no ongoing fuel cost, and it doesn’t harm the environment. The disadvantages include a high initial installation cost, insufficient sunlight in colder climates to collect enough energy, and the need for supplemental heating, which is usually electric and more expensive.
The first geothermal energy source established for power production was drilled in 1924 in an area called The Geysers, north of San Francisco. Deeper wells were drilled later in the 1950s to tap into additional sources of energy.
Geothermal energy is clean, considered safe, reliable, and renewable, and can be used to heat both water and air.
Publication date: 11/12/2001