
The Job Journey: A History and Evolution of Employment Applications (Part One)

Updated: Apr 22

Vincenzo Frosolone

June 23rd, 2023

Abstract: Part One of this essay describes the history of the job application from the sixteenth through the twenty-first centuries. It delves into the historical backdrop behind the form that we know and love today. Little did I realize how monstrous this project would become. I have experienced time and again the phenomenon of starting a research project with one goal in mind and encountering relevant tangents and rabbit holes along the way. The creator’s curse is truly endless, but it is the creator who ultimately decides their spell. Anyway, enjoy the essay.


It is ironic that those without a job have a harder time finding one (Pedulla 1479-1480). In spite of this strange yet unfortunate truth, the United States has been bouncing back from the COVID-19 pandemic. As of April 2023, the unemployment rate was 3.6%, almost one quarter of the extraordinary 14.7% recorded in April 2020 (U.S. Bureau of Labor Statistics), shortly after the World Health Organization declared the pandemic that March. There is hope yet, citizens.

Fellow citizen, I understand that you are seeking stability, a venue for your passions, an outlet for capitalizing on your past experiences; in short, you need a job. Be not ashamed that you are, as of June 2023, merely one of more than twelve million unemployed Americans (U.S. Bureau of Labor Statistics), for every common citizen has had to, or will have to, at one point or another, complete a dreaded job application form.

A staple of the American dream for more than a century, the honorable job hunt has never been without its share of missed targets. By targets, I mean demographics: people of color, women, and people of various ethnic backgrounds, sexualities, and creeds. Certainly your acquisition of a job will not satisfy these hungry ghosts which haunt the streets and wires of the union, yet perhaps you can one day observe change as a cog in the machine, evolving as more debates over long-denied rights jam it.

Meanwhile, I hope that you can stick around as I guide you along the job journey, namely the history of the job application as we know it today and, in Part Two, the backgrounds of each individual section of the application form. The recruitment rabbit hole goes deep. Past this glass door, the job journey is certainly long and tortuous, but indeed we will zip through the application process so that you can relax and wait for your callback.

Your typical job application form is a document that requires applicants to answer certain, yet certainly not all, questions about themselves in order to rank their eligibility for open positions. Employers may order the forms, also known as blanks, from vendors, or produce them with the help of their attorneys using the information most commonly baked into resumes (Inc.). Online resources for job application blank formats include Jotform, Aidaform, and Wix.

Thousands of applicants file these forms daily, and hiring managers drown in them. They must quickly scan the blanks for worthy candidates. They may resort to categorizing submissions into heuristic stereotypes to save time (Pedulla 1479). At home in their beds, they wonder how this mess came to be, how so many thousands of people wade into the applicant pool, pile on the application blanks, and stress them out.


Part 1: The History

Pre-Revolution Era: 1591 to 1750

Mail was the dominant means of long-distance communication before the invention of the telegraph. For that, the United States had none other to thank than The Merchant Strangers, an independent English cloth merchants' parcel delivery service. Queen Elizabeth I of England and Ireland stifled their operation in 1591 when she prohibited any entity but the royal post from delivering foreign letters so that the Crown could reduce inspection costs (Priest 34-35).

Throughout the realm, royal monopolization censored, impeded, and eventually outlawed private correspondence (Priest 35-36). The Restoration Act of 1660 later tasked the Crown with selecting postal contractors and collecting their profits rather than assigning the position to top bidders (Priest 37), until King Charles II in Scotland gave the responsibility to his closest friends and allies; the problem was, they were not dedicated to the task (Priest 38-39). Finally, in 1667, the postal service became a government agency (Priest 39). In 1692, the Crown tasked one of its lackeys, Thomas Neale, with finally promoting intercolonial correspondence after Neale's prior failed attempts to do it on his own (Priest 41-42).

Neale was a groom-porter who attended to the King's dice, cards, lottery, and coinage; he was also "Commissioner for Wrecks for the Coast of Bermuda" (Priest 42). King William III and Queen Mary II over in England gave Neale a twenty-one year monopoly over intercolonial correspondence as long as he maintained the rates from 1660 (Priest 42). This was done in order to "increase expected return to encourage investment in invention," as they did not really expect the operation to withstand its full duration (Priest 43). Neale, feeling untouchable I suppose, commissioned Andrew Hamilton, then the governor of New Jersey, to institute all of the arrangements (Priest 43). When King William III ordered an audit of the operation six years later, Hamilton revealed that "expenses had been almost three times greater than receipts" (Priest 44). When the Postmasters General of England asked Neale to pay up, he instead offered to sell the operation back to royalty; they refused, and Neale died a bankrupt, unfulfilled man a year later, having never accomplished his goal (Priest 44). Responsibility for the operation would be bequeathed to Neale's creditors until 1707, when the Postmasters General agreed to take it back; they would not see a profit for almost 55 years (Priest 44).

Meanwhile, in colonial America, John Campbell experimented with a different delivery system, not of letters but rather of ideas. He published the first classified newspaper advertisement in The Boston News-Letter on April 24th, 1704, which also happened to be the newspaper's debut issue. The only advertisement in that issue advertised itself and read:


This News Letter is to be continued Weekly; and all persons who have any Houses, Lands, Tenements, Farms, Ships, Vessels, Goods, Wares or Merchandizes, etc. to be Sold or Let, or Servants, Runaway; or Goods Stole or Lost; may have the same inserted at a Reasonable Rate; from Twelve Pence to Five Shillings, and not to exceed: Who may agree with John Campbell, Postmaster of Boston. All persons in Town or Country may have the News-Letter every Week, Yearly, upon reasonable terms, agreeing with John Campbell, Postmaster, for the same (Mabie 603).


Classified advertisements propagated in newspapers on both continents. First came an early help wanted advertisement printed in London's The Post Boy newspaper in the January 18th through January 21st, 1717 edition; Plaisterers' Hall, a London livery hall, requested a waiter to serve meat and beverages to their esteemed guests (Morphew 2).

Around this same time, the poet and translator Alexander Pope figured out that customers expressed loyalty when they felt honor and recognition from inscribing their names onto an exclusive subscription list and, perhaps more importantly, that they would keep supporting the publication for a longer period of time (Mason 31-32). Perhaps the editors of London's Daily Advertiser attempted to mirror this idea during its first three weeks in 1730, publishing nothing but advertisements (Innis 3). They realistically could have realized the full potential of this scheme by advertising exclusive deals to subscribers, as periodicals had begun to follow this trend during the 1720s (Mason 32; 33). Also in 1730, in Philadelphia, Pennsylvania, Andrew Bradford took a shot at classified advertising in his publication The American Weekly Mercury when he posted requests for woodworkers, bricklayers, and hatters, and may have been one of the first colonists to do so (Shaw 416).

Subscribers to book publishers got to read their names and occupations on the subscriber list and therefore felt closer to their communities (Mason 32), so when help wanted advertisements called for candidates with specific skill sets, citizens could plausibly have discerned local competition, spurring them to apply sooner rather than later. With the hundreds of shops sprawled across England in the 1730s came a wind of change as newspapers advertised stores which offered a new convention of fixed prices, eliminating the need for negotiation (Martin 174), at least temporarily.

It seems that an air of exclusivity which had animated subscription services, specialty shops, and fixed prices affected the temperaments of consumers, as masters advertising apprenticeship programs (Sanders 38) attracted legions of English workers in the first half of the 1700s, earning ample revenue for the Crown under a fee system dating back to Queen Elizabeth I (Sanders 36). Adolescents as young as fourteen could train with masters, for a fee of course, in order to fortify their demeanor (Sanders 37) and develop specific skills; it was fairly common for nepotism to weigh the scale toward one candidate versus another, but in London, applicants (sorry, apprentices) were matched to masters based on their competence (Sanders 39). This air rode a front carrying a wave of unprecedented materialism, placing the onus on business owners to supply the increasing demand for a widening variety of goods, as specialization bred individuality.

If only Plaisterers' Hall had posted their advertisement merely half a century earlier, then they, rather than the likes of Harding and Stewart, could have altered the landscape of consumerism. In 1665, in Reformed Switzerland, the church had briefly restricted women under the age of forty from wearing a certain fashionable black funeral dress, called the Sturtz, if they were not openly mourning, which may have signaled a rift in the search for new fashion trends (Levy 386-387). Pastors requested more pay so that they could afford less raggedy clothing, which translates to avoiding the stodgy dress required by church doctrine (Levy 388). By the first decade of the 1700s, more German women rebelled against wearing the Sturtz, which only hardened the resolve of the religious authorities to regulate what women could wear (Levy 390). The women were vastly more attuned to the French ardor for feathers and veils (Levy 170; 392) driven by "urban, military, and courtly elites" (Levy 148), an ardor that would call for shops mimicking department stores in 1780s France (Levy 146). The restriction expanded beyond the church into the city of Zurich, Switzerland (Levy 392), where in the 1680s authorities demanded that women mind their trims, ribbons, and ruffs down to the centimeter, to the point where fashionable dress was deemed the work of Satan (Levy 394).

In 1733, the author Johann Jacob Ulrich (bearing no relation to the Swiss artist) bashed women wearing the likes of the Bodenkappen and Tächli for ushering in an era during which "senseless ambitions and status consumption would ultimately be the ruin of all" and obfuscate class distinctions (Levy 397). This authoritarian take on fashion ultimately left Switzerland and Germany behind the curve. Until the end of the 1730s, there were occasional periods during which "the authorities responded with explicit morality campaigns against new fashion trends" (Levy 398).

As housing improved, so did the desire for kitchenware and bedding, including utensils, tables, mirrors, and dressers, among other newer commodities (Martin [b] 153). This led people to invite more guests to their houses, and of course the kitchen needed to be stocked with the latest dishes, lamps, and card tables; hosts wanted to look sharp as well, so they invested in clothing and accessories in the middle of the 1700s, which is precisely when Ulrich's prediction would start to ring true (Martin [b] 153).

Ulrich was quite off base considering the mercantile milieu across the water in England. Glass wares had carried the Italian market back in the 1480s just as textiles would carry the English market; the city of Murano capitalized on this by carrying crystalline glass exclusively in its own local shops, precluding its availability anywhere else in Italy (Levy 111). Yet Ulrich's fear of blurry class distinctions would also come to pass, as for the first time, decency slowly but surely began to define both wealthy and middle-class ideologies, even for those living in decrepit dwellings (Martin [b] 154).


The First Industrial Revolution: 1750 to 1830

Do not take for granted the technologies that we enjoy today, for if it were not for the innovations unearthed during the two Industrial Revolutions, then we would not have our capitalist society with its interpersonal communication, intercontinental travel, and international corporations. The date ranges, of course, are not absolute; approximations better accommodate the precursors and successors of two distinct eras of prolific fabrication.

Women were hired to sew uniforms and assemble bullet cartridges in early factories (Kravitz 50), foreshadowing the explosion of mass production which would occur in the middle of the 19th century; women could also manage retail markets and bars, real estate properties, and classrooms (Kravitz 50). Otherwise, women were expected to be compliant and modest in the 18th-century United States (Kravitz 48). Most women sacrificed their identities to subsist. However, a legion of women would, perhaps accidentally, imbue their identities into industrial history through a quaint performance of the one thing most dangerous for them to display: their opinion.

Debating societies had inhabited London since the 1740s or earlier but truly made their mark in the latter half of the century (Thale 31). The Enlightenment encouraged individuality of opinion, fueling lively debates about controversial topics such as religion, politics, and money (Thale 32). Societies were derided as much as they were praised, but any press was good press, as it stimulated their appeal (Thale 32). Even though enlightened sentiments invited women deeper into male society, women were initially discouraged from public debates thanks to the frequent tendency of the men's freely developed conclusions to escalate into blasphemy and vitriol (Thale 32). Ironically, female access to theaters inspired one debating society, the Robin Hood Society, to extend to women an opportunity to pay to view these debates and thereby bolster its already lucrative reputation; in other words, it capitalized on the subversive nature of women and kept the money that it had been donating to charities (Thale 33). By 1752, the Robin Hood Society advertised itself to a wealthy demographic under the guise of The Temple of Taste, promising orchestral music and fruitful reflections on astronomy, clothing, and the wisdom of William Shakespeare (Thale 34). It also attempted to appeal to its new female audience by starting the debates with questions about motherhood, marriage, et cetera, refraining from divisive topics which would have spurred brawls (Thale 35). Charles Macklin founded the British Inquisition debating society in 1754 (Thale 37). He would unwittingly presage commerce culture: in the late 1700s, exquisite British shops spoiled London's high society with the anomalous privilege of observing a wide array of products without any obligation to purchase anything (Lynch 216).

Macklin had essentially designed a prototypical department store by renting multiple rooms in a plaza, serving porter and coffee to his esteemed guests, and, most importantly, selling philosophy, a product of human interest (Thale 37). His public theater would produce the first drops of rain in the perfect storm spawning the development of possibly the world's first department store. Playing to his demographic, Macklin advertised the British Inquisition as an intelligent society reminiscent of universities and sophisticated ancient civilizations and offered an attendance fee palatable to middle-class society (Thale 38). Never one for modesty, he would embellish the number of attendees and debate topics (Thale 39). Crucially, Macklin was the first host to recruit female speakers for the debates (Thale 40), proving his sensitivity to the emerging female demographic.

Scotland implemented an early example of false advertising in 1760 when it postulated a plan to attract potential military recruits to agricultural villages, not only to populate the area in general but also to enact "the propagation of a hardy and industrious race, fit for serving the public in war" (Mackillop 175-176). These villagers were to cultivate potatoes and turnips with spades rather than plows in order to swell their muscles for battle (Mackillop 176). The greater availability of housing, not only in the British Isles but also in colonial America, expedited the onslaught of the first Industrial Revolution, as people had more diverse opportunities to "allocate resources," with a caveat of uneven financial improvement (Martin [b] 155). It cost Scotland more than £13,115 (or £1,343,851, which is $1,677,455 USD today) to ultimately house fewer men than planned on subpar farmland of insufficient acreage (Mackillop 178-179). It turned out that recruitment inflated the costs of the Scottish labor market to the point where tenants had to hire servants or sacrifice the integrity of their land to make ends meet; in turn, this only strengthened the appeal of recruitment as more land became available (Mackillop 279). This served as an indirect help wanted scheme for building the Scottish militia.

Scotland was clearly itching for a way to produce more vegetables with fewer people. Humankind's desire to more quickly produce textiles and burn coal, not root vegetables, would provide myriad solutions. In 1769, James Watt introduced the world to his steam engine, which squeezed the energy from coal (Chandler Jr. [a] 34). Around the same time, Richard Arkwright devised the spinning frame, also called a water frame because it was powered by a water wheel, which would wind threads of cotton into yarn much faster than human hands ever could (Klein). This dynamism fed a reciprocal exchange across the Atlantic.

A brief summary is in order. British troops had been patrolling Boston for a long while, and a group of Patriots were sufficiently infuriated by their presence. This tension culminated in the infamous Boston Massacre and, later, the Boston Tea Party; as a result, the Crown closed Boston's port and installed Major General Thomas Gage as the governor (Quimby 86). Gage contended with the newly formed Provincial Congress, which organized Boston's burgeoning rebellion, but overall he was hesitant to assert himself over the population (Quimby 86). I find it ironic that the revolutionaries were trying to escape the sovereign fist of Great Britain while the British practice of breakfast tea had become so ingrained in nascent American culture by the 1780s that even slaves were allowed to partake in morning tea (Martin 337).

This was the beginning of a materialistic wave in the United States, where even within the rising middle class, "the home had gained increasing importance in social intercourse" and people started to measure their worth against the value of the things that they bought and showed off (Martin [b] 254). While this demonstrates the dawn of materialism and consumerism in American history, which longed for a convenient venue, it also complicates the American struggle for independence from their sovereign rulers. In 1773, shortly before the start of the Revolutionary War, another unsuspecting contender for prototypical department store status came in the form of Walthal Fenton's tea and textile shop in Newcastle-under-Lyme (Martin 167). He also collected oysters and sent dresses to be dyed (Martin 167). Hundreds of specialty marketplaces popped up across England, including another prototypical department store in Kirkby Stephen owned by Abraham Dent in the 1780s, which "sold everything from soap to silk, with the notable exception of salt, eggs, cheese, and butter" (Martin 170).

Meanwhile, the German states recruited their soldiers "by force, impressment, theft of foreigners, and other means," including scooping vagabonds from their streets (O'Hanlon 223). The nascent American government likely resorted to flipping through tax records to recruit soldiers and their understudies, determining that those with the most property were fittest for service, if its methodology remotely resembled post-revolutionary drafting techniques (Van Atta 269-270). In a way, newspapers acted like draft applications because farmers who responded the earliest to advertisements for plot rental were most likely to have owned more property (Van Atta 279). Irish immigrants flocked to Philadelphia, Pennsylvania during the 1770s, creating an opportunity to utilize a different strategy which would never fly today: recruitment by nationality, via which these "ever ready, faithful, and reliable" immigrants were enlisted into the Pennsylvania Line and divided into eight regiments (O'Hanlon 187). Likewise, General James Clinton in New York and Colonel John Stark in New Hampshire recruited primarily Irish-American soldiers (O'Hanlon 189; 227). The Crown tried to nudge Gage with a command to arrest the American revolutionaries Samuel Adams and John Hancock, but that was not enough, and they decided to take a more direct approach (Quimby 86).

The Revolutionary War had begun. Everybody at the time knew that this would be memorialized as an unprecedented war, as it instantly filled headlines across the nation (Mott 489). Any American who remembers elementary school history class may know that Paul Revere would ride toward Concord to warn the Americans that the British regulars were coming to attack. On April 16th, 1775, Revere told Lexington's force about his suspicions; hearsay spread this knowledge around; two days later, he arranged his famous signal: one lantern hung in the steeple of the North Church if the British marched out by land, two if they crossed the Charles River (Quimby 86). The British crossed the river and confronted American minutemen (Quimby 87), and thus the first shots of the Revolutionary War were fired.

Couriers on horseback could spread intelligence fairly quickly, yet misunderstandings still occurred, such as the time when the Americans thought that the British troops were torching Boston when in fact they saw celebratory smoke from their own grenadiers burning tools (Quimby 91). More than likely, they just had not read the latest updates. We tend to trivialize the knowledge that we learned from history classes, but at the time, it was quite the puzzle to accurately portray the course of events of an international war. The contemporary equivalent of a news wire consisted of a patchwork of alerts from a "post-master friend, or a friend in the shipping trade, who would write occasionally to send routine items, mainly of the sailings of ships, the weather, or the movements of the governor," or from friends, travelers, or new visitors. The chain of publication started from whichever publisher was closest to the breaking story and passed through those who wished to glean details for their own stories, sometimes recording no attribution to the primary source (Mott 490). If these sensational events were not lurid enough to make it into irregular handbills, then they were published weekly by default (Mott 493).

By May 1775, the Boston Committee of Safety, formed in the aftermath of the Boston Tea Party, had founded an independent postal system to spur intercolonial communication during the war against Great Britain (Postal Service 5). On July 26th of that year, the Continental Congress established the General Post Office as it began to recognize the substantial role of intranational communication, with Benjamin Franklin appointed as its first postmaster general (Postal Service 5). Word about the men at battle was about to start traveling at an unrivaled speed to their colleagues, friends, and wives back at home. These words would be marked like the rest into accounts of history, which, it has been said, are usually written by the winners, who are usually men. That sentiment is attributed to Winston Churchill as his response to the Nuremberg Trials, yet it was echoed by Hermann Göring during the same event (Gilbert 4).

We cannot dismiss, however, the role of those wives and daughters in the struggle for independence. A legendary Connecticut woman, Sybil Ludington, born in 1761, has been dubbed the "teenage Paul Revere" (Hunt 187) because in 1777, at the age of 16, she rode her horse from what is now Westport, Connecticut to Carmel, Stormville, and Mahopac, New York (New England Historical Society). This event is dubious at best, yet the story at least imagined the full potential of colonial women. One account, from her father, a colonel, claimed that a messenger rode up to their house late at night and told them to gather the regiment, but since nobody was available that night, Sybil decided to do it herself and save the day (Hunt 189). Willis Fletcher Johnson captured the story of this ride in his biography of Colonel Ludington, which tried to promote the impression that colonels were not recognized enough in history (Hunt 193-194). This inflated the principle of the capitalist white man valiantly putting England in its place (Hunt 191). During reenactments of the war which took place during the colonial revival entering the 20th century (Hunt 191), women were eminent participants in erecting statues and tableaux vivants, or "living pictures," wherein the actresses would star in still images depicting colonial times (Hunt 192). In order to persevere in their effort to preserve colonial history in an era of immigration, they encouraged children, adolescents, and "verified patriot[s]" to attend the festivities (Hunt 192). It is therefore curious that Sybil was mentioned in only two accounts of the war, and that a teenager named Mary Ann Gibbes, who dodged bullets and cannon fire to save an infant, was mentioned once (Hunt 194); worse, even though two books published in the late 1800s reminisced about heroic Revolutionary women, Sybil did not make it into either (Hunt 194). During the Great Depression, road markers would line the trail trodden by Sybil's horse, which may or may not have been named Star depending on the depiction (Hunt 198). This would invoke her own historical revival throughout the 20th century. Sybil would receive national recognition in 1995, when the National Assessment of Educational Progress adapted her somewhat dramatized account for a literacy exam (Hunt 212).

Deborah Samson did more than ride a horse to warn a regiment about an impending battle; she disguised herself as a man and entered battle herself (Kravitz 48). Her sex was belatedly discovered only three years after the fact, while she was waitressing as a side hustle during the war (Kravitz 48). Samson was not washed away by history; it is apparent that revolutionary adrenaline fogged the sexist portion of men's brains enough to welcome women donning male soldier uniforms into the battle for independence and to publish their audacity, even though those brave women were soon shunted back into their subordinate positions (Kravitz 50). For example, the Quaker woman Elizabeth Ashbridge had written in her acclaimed autobiography, published in 1774, anecdotes of her transatlantic proselytism (Kravitz 50), while Sarah Osborn's pedagogy, unwavering spirituality, and antipathy toward slavery won her community recognition (Kravitz 51) and Phillis Wheatley's poetry, which sanctified Black freedom, earned her a stamp of legitimacy (Kravitz 51); yet after the war, the legacy of women was once again reduced to that of a lowly maid (Kravitz 50). In particular, the valor of Samson, this "lively, comely young nymph" (Kravitz 47), was synopsized by a 1784 newspaper article which chronicled her as an exemplary soldier, although the editor claimed that her sex was uncovered in Philadelphia due to "a violent illness," after which she was honorably discharged (Kravitz 47). That first article added that she had done this to escape forced marriage (Kravitz 47). Another account advanced the opinion that Samson was only discovered after the war because her fellow soldiers were complaining about their underwhelming pay and got themselves caught in a "bout with 'malignant fever'" (Kravitz 54). Unsurprisingly, despite her "lecture tour" (Kravitz 52), her name was mostly forgotten for decades to come, in part because the journal in which Samson marked her wartime achievements was lost at sea (Kravitz 53) and because she toured under the pseudonym of Deborah Gannett (Kravitz 54). Nevertheless, she was more than likely intelligent enough to outman the men in her Middleborough community; after a drunken episode blew her cover there, she enlisted again at West Point in 1782 (Kravitz 52), and her unearthed legacy eventually granted her the distinction of becoming the first official state heroine in the United States, in Massachusetts in 1983 (Kravitz 51).

The department store did not develop in one day. Its genesis was the byproduct of the mass production warranted by the Industrial Revolution. Starting in 1781, in a newborn United States, the Articles of Confederation granted the Continental Congress monopolistic rights to "establish and regulate post-offices" in opposition to the British postal tax (Priest 45; 47). This move ironically rivaled Britain's own royal stranglehold. However, in 1789, after the Constitution was ratified, Congress claimed to have established a postal monopoly because no private service could have done it justice (Priest 49). Some periodicals, such as Boston's Post-Boy and Evening-Post, never woke from their indefinite wartime hiatus, but those that did heralded a new era of press production. London's Daily Universal Register (later The Times), alongside periodicals like The Morning Chronicle, The Morning Post, and The Observer, reaped the benefits of mail stagecoaches in the 1790s.

Moreover, in 1792, the postal policy was "formally established by an Act of Congress," and by 1794, the "first letter carriers appear[ed] on [the] streets of some American cities" (Postal 9). The combination of consumerism and convenience essentially begat Harding, Howell & Company's Grand Fashionable Magazine in 1796 (yes, that is the full name of the store, which henceforth I shall abbreviate to HHC) (Glancey). This market was "consistently so crowded that its forty employees could devote little time to individual customers" (Lynch 222). Located at 89 Pall Mall in London, England, HHC offered four departments collaboratively selling "furs and fans, haberdashery, jewellery and clocks, and millinery, or hats" (Glancey), an advance on the simple tea and textiles trade of the recent past.

For women, this was a natural evolution from being exploited to weigh their opinions in a public outlet to being expected to weigh their options in a public retail outlet. Entering from stage left is the July 1884 issue of The Educationist, printed in Indianapolis, Indiana, which contains the earliest evidence that I found in my research of the familiar job application blank, also known as the application form. Prior iterations of the job application process had consisted of someone mailing a simple cover letter or a description of their skills to employers and letting the train conductors seal their fate. This medium was long past outmoded, considering that letter writing as formal correspondence had arisen in medieval times, when students would follow templates compiled by their instructors (Edwards 8). This was the start of a new saga in job applications, when people could simply fill out a form of predetermined employment qualification questions with answers that aligned more closely with the interests of the employer than with those of the applicant. They could catch the coach of progress that had been chugging along since the development of medieval civilization, when governments were obliged to maintain official records (Edwards 10). The Teachers' National Bureau posted an advertisement requesting qualified educators to "write at once for [their] 'Teachers' Application Blank'" (Teachers' National Bureau). Prior to that, application blanks had been utilized for fraternal group membership, educational courses, and, after the absolute earliest iteration of "application blank" that I could unearth, insurance documents, as found in New York's The Albion in 1844.

Shortly, applicants and employers would finally share documents that attempted to stake out a middle ground between corporate avarice and civilian anxiety. "Newly affluent middle class women" could explore the shelves and sections and spend their hard-earned money on whatever newfangled fads suited their fancies (Glancey) while debating whether to work there themselves. Women, and men, no longer had to negotiate prices because HHC introduced fixed prices so that shoppers could spend more time simply strolling around and socializing (Lynch 221). It became a social haven where women could publicly converse with other women (Lynch 222).

By the turn of the century, postal service touched the "farthest reaches of the western and southern frontiers" owing to the Federalists' unerring resolve that propaganda would rescue the floundering postal business (Priest 50-51). The Leeds Intelligencer, The Yorkshire General Advertiser, and The Ipswich Journal, among the aforementioned English periodicals, adopted steam-powered printing in 1814, delivering articles and advertisements directly to customers (most likely women) more quickly (Innis 5-6). All of these newspapers advertised HHC, so it follows that they were ahead of the marketing game. Other department stores such as Thomas Tucker's, Hanningtons, Binns, Brown Muffs, and Nathan Israel's all got to catch the progress train, but none left behind a legacy as historical as that of HHC. Meanwhile, in 1812, the United States suffered from a paper shortage, which produced newspapers the size of blankets with exorbitant subscription prices and incessant advertisements (Innis 9). The country would not tarry for long, for some factories were already turning to the next page of the first Industrial Revolution and engineering locomotives. In Marietta, Georgia, for example, Glover Machine Works hired convicts to produce train parts (White 24-25). In 1814, Francis Cabot Lowell memorized and even improved upon blueprints for Edmund Cartwright's power loom to contribute to his integrated textile manufacturing mill in Massachusetts (Klein). Mills had already proliferated across England and turned Manchester into the first major industrial city (Chandler Jr. [a] 34), yet George Washington espoused the theft of English intellectual property because he believed that to reduce the toil of physical labor on his constituents would "be of almost infinite consequence to America" (Klein). Alexander Hamilton, the first Secretary of the Treasury, thought that "the development of a strong manufacturing base was vital to the survival of the largely agrarian country" (Klein).

This notion was a reflection of America's operous upbringing and acted as a reward for the valiant efforts of her workers in building the country. Preceding the birth of the United States, colonists followed the "sun-to-sun" rule of work, which "predominated in the new nation, for the hardy background of the citizens demanded long and vigorous toil," while treating idleness as a vice (Cuenin 5). During this era, people did not have much choice of employment beyond yielding to nepotism and working on the farm or in the family business (Sundberg [a]). Child labor was common, with fourteen-year-olds and fifteen-year-olds working on farms and in factories (Lebergott 148; Cuenin 7). I can see how resorting to child labor could have pragmatically bridged gaps between sicknesses, resignations, and terminations, especially in a country still figuring this independence stuff out, but it became an issue. Communities cognizant of the harsh labor practices faced daily by their children constructed playgrounds and advocated for the protection of their precious youth starting in the latter end of the 1800s (Lauwaert 40).

English employers had another advantage outside of the world's first department store proper: apprenticeships had waned in popularity by the end of the 1700s, leaving potential apprentices to find openings in help wanted advertisements and roam retailers for employment opportunities (Sanders 42). This was a welcome change, since masters were not always known for their honest business practices (Sanders 47). Shopkeepers slipped American shopping culture under American noses from the 1820s onward as they posted advertisements not only in newspapers but also on flyers mailed directly to their customers (Laird 25). I would strongly wager that these retailers, including Arnold Constable & Company (ACC), took full advantage of this phenomenon.

ACC, founded in 1825, was the oldest department store in the United States, first at 156 Front Street and, two years later, relocated to "the corner of Canal and Mercer streets" with the help of founder Aaron Arnold's nephew (Kopytek). At this time, department stores like Lord & Taylor swarmed New York City near Grand Street, "for it fringed the residential section of the town of those days" (The Sun). Amidst all of the sweeping developments, small business owners found themselves pleading to expand their workdays to keep up with the factory competition. I would surmise that not as many Americans were interested in working for companies embroiled in unionization drives and protests over poor working conditions. Dissent had begun as early as 1791, when Philadelphia carpenters went on strike longing for a workday restricted to ten hours (Cuenin 4). In 1793 (Cuenin 4) and again in 1824, women staged walkouts at a Pawtucket, Rhode Island textile mill (Blake 907). In 1828, workers in Paterson, New Jersey, Philadelphia, and Boston struck for shorter hours and greater pay (Cuenin 5).


The Panic of 1837

It was 1830. Corporations flourished, railroads frolicked, and the Port Phillip District Wars burst out in Victoria, Australia. The first Industrial Revolution was winding down. New York City department stores migrated farther north toward Broadway, "between Union and Madison squares, which was considered far uptown in antebellum times" (The Sun). Mail delivery smartened up as the advent of impulse buying drew criticism of its languid pace; after all, those promotional flyers had to infiltrate customer mailboxes as soon as possible.

This provoked Postmaster General William Taylor Barry to decree locomotive postal service; the stagecoach contractors Samuel Slaymaker and Jesse Tomlinson exemplified this new circumstance by delivering mail on a Pennsylvania train traveling from Lancaster to West Chester in 1832 (Romanski). As if on cue, Norris Locomotive Works in Philadelphia became a major player in the nineteenth-century locomotive commotion, testing designs in the early 1830s and later engineering its own (White 64). Meanwhile, in 1833, Benjamin Day supplied the country's first penny press tabloid, The Sun, which used "simple language" to serve human-interest sensations to immigrants (Vida 431). I guarantee that, if it had not occurred before, a sharp rise in job seeking occurred as a result of the increased readership. Not to be left in the dust of change, London saw the American penny press and raised it newsboy delivery services in 1833, financially scaffolded by advertising revenues (Innis 9). English newspapers were ostensibly late to the delivery game and had to press on during this race for circulation dominance; advertisements were demoted to later pages as sensational headlines took the forefront (Innis 10). This was partially because the papers were trapped in a creative fix: advertising agents had grown wary of writing advertisement copy for posters and trade cards simply because it was proving to be lucrative for the businesses themselves and not for the agents (Laird 158).

The United States was indubitably prosperous by 1835. However, capitalism, consumerism, and convenience catapulted the country into a cataclysmic crash. It passed through "the deep hollow of a great economic cycle," losing more than $6 billion USD in contemporary currency (today: $200 billion USD) by 1840 (Rezneck 662). Thus ensued the Panic of 1837. Entire cities, such as Buffalo, New York, collapsed under the financial strain of depreciating property values and virulent bankruptcy (Rezneck 663-664). Rail mail did not back down, though, as in May, John E. Kendall became the first clerk hired to supervise locomotive mail delivery (Romanski). D.J. Burr's manufacturing firm began producing locomotives that same year despite the Panic; I reckon that this labor likely required more employees than Burr and his three associates (White 38). It seems that the effects on employment did not set in during the first moments of the Panic, yet there was definitely reason to worry, as "labor's loss came chiefly from want of employment and from lowered wages, which created an immediate problem of relief" and buttressed classism, especially in major cities tracing the eastern coast of the United States (Rezneck 664). In August, someone posted a classified advertisement in a New York publication asking for "twenty spade laborers to do country work at four dollars a month and board"; 500 people applied for the job that same day (Rezneck 664). Ninety percent of factories on the East Coast went out of business; I would hazard that the remaining 10% were busy accepting applications for employment (Rezneck 665). The Erie Railroad, connecting New Jersey to New York, offered to hire "three thousand men, if the city would lend its credit for supplies" (Rezneck 665).

The Panic of 1837 would last four more years. In 1838, Congress "declared that all railroads in the United States were post roads" (Romanski), underpinning the position of rail mail in American discourse. The guinea pigs were tracks spanning between Washington, D.C. and Philadelphia, Pennsylvania, as well as between Boston and Springfield in Massachusetts (Romanski). Express firms also bloomed nationwide, beginning with Harnden's Express and followed later by Wells Fargo. Wells Fargo ran under the radar by buying bulk supplies of government-stamped envelopes and pressing their own stamps on them instead; they prospered in the western United States until the government finally permeated that market in the 1880s (John Jr. 139). More and more routes were established as a postal monopoly tightened its grip, but this did not stop private express messengers from stealing Congress' thunder in the 1840s (Priest 59). Almost 3,000 miles of railroad track prime for postal service sprawled across the United States by 1840 (Smithsonian). This liaison accelerated agricultural employment by 60% and railroad employment by 300%, in conjunction with "a massive population increase— in city slums, in the open country, on frontier farms" (Lebergott 117, 121). Hundreds of thousands of men, including thousands of Irish immigrants, constructed railroads for mail delivery, cargo, and casual travel (Lebergott 191). Dressmaking, hatmaking, and tailoring also employed more than 52,000 men, and education employed more women as the classroom ratio jumped to 33 students per teacher (Lebergott 142; 143; 201). Fishing, whaling, mining, and sailing came out as honorable mentions (Lebergott 155; 164; 166). Lima Hamilton Corporation in Lima, Ohio, established an incredible proof of concept as it employed 4,300 locomotive engineers during the 1840s (White 52). Also in 1840, Dennis, Wood & Russell in Auburn, New York employed convicts to help produce locomotives (White 43); I guess even prisoners were desperate to be hired during the Panic. President Martin Van Buren, in response to union protests, auspiciously settled on ten-hour workdays rather than the contemporary average of twelve, but primarily for "federal employees engaged in manual work" (Ruggeri 140-141), following a lengthy debate for which this post has no room. It did cause a rift in the employment pool, though, which meant that employers relied on the speed of trains and presses to communicate their fresh openings.

Here is my favorite part: reading through archived help wanted postings. For example, in 1838, Andrew Cunningham requested 50 Black and white carpenters and apprentices for immediate hire, promising steady employment and punctual pay (Charleston Daily Courier 3), and the Brattleboro Bookstore in Vermont sought "several active and well educated young men" for immediate application toward employment (Vermont Phoenix 3). In 1840, one Michael Cochran urgently required 200 laborers for the Philadelphia and Reading Railroad, and a Mr. Reeve wanted to immediately hire three engineers and surveyors for Franklin Academy (Philadelphia Public Ledger 3). Reverend D.S. Ingraham submitted a letter to Boston's The Liberator expressing that slaves worked for a fair price once all of them showed up to the site, although they could be easily coaxed into working for any pay rate; their masters, however, charged an exorbitant amount for rent to make them work harder to pay for their quarters, whereas in England tenants could easily sell their property because it was low in value (Ingraham). Desperate for customers, Lord & Taylor promised fixed prices for their products in 1838; A.T. Stewart's managers (Resseguie 310) would follow suit, along with the competition, so that they could afford to hire more salesclerks (Laird 26-27). Already salesclerks were complaining about the anonymity and tedium of the retail environment (Luskey 181). Stewart may have aggravated this opinion with his fixed-price promise, which served to speed up the transaction process and reduce personal encounters (Luskey 207).

In 1839, none other than Francis Scott Key, Esquire, the guy known for writing the Star-Spangled Banner poem, penned a passionate letter responding to Reverend Benjamin Tappan's examination of slavery (Key). Key discussed the state of Maryland transitioning to a free state after recognizing that its border with Pennsylvania saw marvelous progress in that affair, and concluded that if southern states compressed their slavery toward the border of the neighboring free states, then emancipation would surely occur more quickly; Maryland soon figured that slave labor was no longer a lucrative scheme (Key). I can infer that the eradication of slave labor opened the doors for free citizens to apply for agricultural and manufacturing opportunities. In 1839, Frederick A. Hinton in Philadelphia recognized the high demand of the agricultural industry and promised immediate hire and high wages (Philadelphia Public Ledger 1), and the Brattleboro Typographic Company in Vermont declared that applications from all sources, including the mail, would "receive immediate attention" (Vermont Chronicle 4). In 1840, Senator Richard Niles, described as a radical Democrat, argued that it would cost manufacturers less to let the sub-treasury department handle recruitment, as the cost of various commodities was also on the decline, but he glossed over the notion that this decision would reduce employee wages, if only negligibly (Craig). That same year, a Baltimore company wanted "a power loom dresser, a mule spinner, and a machinist," along with "two families accustomed to cotton works," for immediate employment (Baltimore Sun 3). In 1841, in Buffalo, New York, a coalition of female parishioners acknowledged that women were suffering from unstable employment and formed the House of Industry for the Relief of Poor Females on Washington Street in order to put them to work mending clothing for "liberal wages"; their article in the Commercial Advertiser and Journal asked for financial and sartorial donations (Jewett 3). That same year, several firms longed for various types of employees for immediate hire (Philadelphia Public Ledger 3).

One possible cause of financial strain was exorbitant postal rates and inconsequential routes, which resulted in boycotts nationwide (John Jr. 139-140). This inspired postal enthusiast George Pomeroy, against the wise advice of his colleague James Hale, to found his own express mail-carrying service and incidentally popularize the postage stamp (Bicknell 20). Starting in 1844, the hypocrite Hale would devour the northeastern United States and even deliver to Great Britain with his own postal venture (John Jr. 141). Meanwhile, Baldwin Locomotive Works in Philadelphia was "the largest and most successful firm whose history spanned the entire steam-locomotive era" (White 29). In the 1840s, the firm had hired fewer than 300 men, but by the turn of the 1860s, 1,700 men worked there (White 29-30).

1842 was a year of relief. In Boston, the Tariff Act revived business prospects and helped to gently lower the unemployment rate after the city's Society for the Prevention of Pauperism urged beggars and the chronically unemployable to leave the city; years later, reports still accounted for the permanently unemployed (Rezneck 667; 670). That same year, Congress put into effect the Federal Bankruptcy Act, which it had sanctioned the prior year; this act "finally wiped out four hundred and fifty million dollars of debts, affecting one million creditors" (Rezneck 664). Joseph Story, an associate justice of the Supreme Court, pushed this relief for all debtors, not limited to merchants and creditors (Tabb 17). Democrats opposed the bill, worried that "voluntary bankruptcy was unconstitutional," while Whigs championed it but disputed whether corporations should be eligible for this relief on equal terms with common citizens (Tabb 16).

Ultimately, corporations were excluded, in a victory for the multitude (Tabb 16-17). Few exemptions were granted, but debtors not fond of the new law could fight the discharge in court (Tabb 17-18). Once the Federal Bankruptcy Act had served its purpose, it was repealed in 1843 (Tabb 18). Hundreds of factories and marketplaces enjoyed the aftermath of this measure because they could now afford to hire more employees. After all, the average salesclerk earned between $200 and $600 annually (today: $7,800 to $23,500) (Luskey 182). Case in point: by 1843, Lowell Machine Shop in the industrial town of Lowell, Massachusetts had amassed up to 1,200 locomotive engineers (Lebergott 128). However, private letter carriers took advantage of this relief, undermining the superior federal operation, until Congress enacted new, still-relevant provisions that thwarted the practice in 1845 (Priest 67). Congress also lowered postage rates to prevent any further iterations of private mail-carrying schemes (John Jr. 142).

One business owner who struck at an opportune moment to open his historically esteemed department store was Alexander Turney Stewart. Parroting his competition, Stewart decided to found his flagship store near Broadway on Chambers Street in 1846, but critics chastised him for situating himself "uptown and on the wrong side of the street"; his decision to build the store on the eastern side of the street, in an overcast spot near the docks, distanced him from the privileged residential area (Domosh 44). Southeast of the competition, Stewart quickly asserted himself as a millionaire mercantile mogul (Domosh 44). He knew that the wider streets were perfect for carriages dropping amenable women at the front doors (Domosh 52). He likened his employees to "cogs in the gigantic machine he was operating," although he bestowed the full onus of customer satisfaction onto his salesclerks and prioritized hiring the right cashiers and managers of merchandise acquisition, store maintenance, delivery services, and credit services to hold down the fort (Resseguie 314). This philosophy epitomized the notion that the quality of a salesclerk was directly proportional to customer satisfaction, in that "the impossibility of separating the two made the store executive's task far more complicated than that of the factory manager" (Benson 6). I could not find concrete evidence of Stewart applying the ten-hour workday principle, but his employees may have been able to find a way around it to work fewer hours.

In Manchester, New Hampshire, the Female Labor Reform Association of Manchester incorporated the ten-hour workday into textile factory schedules, although a loophole allowed for the distribution of contracts which signed workers on to working longer than ten hours per day (Blake 908). Seemingly, this legislation coincided with an influx of underskilled English immigrants into Boston, New York City, and Philadelphia. Seeking to escape a British workforce crowded by the Panic, they ironically plopped themselves into that very same situation of low wages and living standards; through their simply being there to dilute the labor market, cost cuts, lower wages, and, once again, poor working conditions ensued (Blake 908). Worse, industrial laborers worked twelve-hour and thirteen-hour days throughout the country even through the Civil War, and the battle for eight-hour days would be waged for decades thereafter (Cuenin 108). This is not to mention that retail work was seasonal, with lulls during the peaks of winter and summer and with icy throughways and viral diseases compounding the winter slump (Luskey 184).

Regardless of the standards to which factory owners held their staff, they found themselves exporting more raw material for Lancashire's textile mills and importing British textiles and hardware for the new machines humming within the industrial bourgeoisie and proletariat sectors of British society (Chandler Jr. [a] 35). New technology was transpiring so rapidly that farms and corporations believed that anything was possible. In the case of Eli Whitney, inventor of the cotton gin, cotton farmers had essentially asked him to invent it because they were able to project its indispensable value for the inchoate cotton farming of the South (Chandler Jr. [a] 36). Stewart was one of those intuitive businessmen, abreast of the stock market and consumer psychology; his store's ornamental dome and rotunda combination was conducive to publicity, and as such it would "attract customers at the same time that it would add cultural legitimacy to the commercial impulse" (Domosh 55). His market did not resemble a house, as did most earlier precursors to the department store (Domosh 53); through his doors, customers would preview a new world of commerce. He hired hundreds of versatile salesclerks for his establishment by the end of the 1850s (Luskey 178); the fact that, out of the 2,500 total salesclerks Stewart hired before his death in 1876, almost 300 lingered for at least a decade or two attests to his prestige in the game (Resseguie 315).


The Second Industrial Revolution: 1850 to 1914

The United States was late to the second Industrial Revolution. Taking a break from stealing British blueprints, it displaced British imports by opening up "the anthracite fields in eastern Pennsylvania" leading from the ports (Chandler Jr. [a] 36-37). The corporate web stretched to ensnare more "executives, senior managers, managers, assistant managers," and so on. Disappearing were the days of nepotism featuring only "one or two salaried managers" and a single superintendent tending to a plantation or factory producing a single product at a time (Chandler Jr. [a] 37), because companies had to start hiring people based solely on specialized skills (Sundberg [a]). Classified advertisements became expensive, however, and by the 20th century, companies would have to hire recruiters to thumb through the hundreds of resumes that they received (Sundberg [b]). Ironically, even though travel was faster and easier than ever before, candidates still "had to type out a cover letter and mail hard copies to each company," which was time-consuming and led people to apply to local jobs "within their specialized field" and stay there for decades at a time (Sundberg [a]). Distribution, not transportation, was the zeitgeist of this second revolution.

The corporate hierarchy was a monumental byproduct of this second Industrial Revolution, even though archetypal structures had existed in the 1500s, when Italian manufacturers would apportion the construction of commodities, such as furnaces, among multiple workers who gained experience by watching and subsequently participating in the craft (Levy 104). Echoes of modern-day consumerism bounced around the walls of Italian glass guilds as they sold impressive stocks of glass wares (Levy 65-66). At the time, workers of any level were always temporary, working for up to a year at a time, and their unique experiences at one shop made them that much more employable at another (Levy 105-106). French featherworkers in Nuremberg could only hire one apprentice at a time for up to five years, and could not hire another within one year of an employee leaving their shop (Levy 151). Account books recorded the exhaustive collaboration of suppliers and shopkeepers trying to satisfy slews of customers well into the 1600s (Levy 330). By the 1620s, shops in Seville sold “textiles and haberdashery supplies sourced from far and wide” (Levy 331).

Although Stewart was not the only exemplar of the Industrial Revolution, he was keenly cognizant of the prevailing market. While most small markets resorted to annexing neighboring houses and installing larger display windows for their expanding inventory (Lauwaert 26), Stewart armored his department store in cast iron in the 1850s (Domosh 48). Not to be outdone, Constable upgraded from brick to marble and invested in a mansard roof and a “cast-iron facade” which certainly made ACC a magnet for reckless spenders (Domosh 48). Some department stores incorporated “nurseries, children’s theater, and even miniature indoor zoos” to attract parents and their spawn (Opler 80). Others exploited the domesticity of mothers and beguiled them and their children with a spacious, regal design that women “interpreted as elegant satellites of domesticity in which [they] had a measure of control, exuding the cozy sentimentality associated with the middle-class home rather than the rationality of market transactions” (Luskey 199). Soon, merchants would need to bulwark their growth by sustaining a corporate hierarchy.

By this time, railroads were a mainstay of the postal service, as “spending for railway mail grew from $1,275,520 in 1852 to $2,310,389 just four years later” (Smithsonian). (The 1856 expenditure is valued at almost $82.5 million USD as of June 2023.) During the Civil War, “all mail in transit began to be distributed in railroad cars,” not exclusively mail for certain checkpoint locations along the lines (Romanski). The modern hierarchy initially functioned in the 1850s to monitor railroad tracks spanning multiple stations before it came to balance the power structure of retailers and other corporations (Chandler Jr. [b] 474). “Labor intensive” enterprises hired more employees, while “capital intensive” enterprises focused primarily on improving their machinery and processes in order to achieve more lucrative production (Chandler Jr. [b] 480). Over time, more retailers preferred the latter descriptor, as profit dwarfed personability and self-subsistence became the end goal in the 1850s (Luskey 175).
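For the curious, that 2023 estimate amounts to applying a cumulative consumer-price multiplier of roughly 35.7 between 1856 and mid-2023; the multiplier here is back-calculated from the two figures themselves rather than quoted from the Smithsonian source:

\[
\$2{,}310{,}389 \times 35.7 \approx \$82{,}500{,}000
\]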

Outside of the railroads, retailers were producing more product than they could sell and started to establish their own wholesaling and raw material acquisition departments (Chandler Jr. [a] 39). While wholesalers were trying to catch up with manufacturers by devising the “multiunit, multifunctional firm,” retailers packed product demonstrations, credit, and repair services into their bag of tricks (Chandler Jr. [a] 39). Their employees received cultural praise as they worked harder, grew to own rather than simply work for markets, and embodied “the economic hopes and dreams of an optimistic era” (Luskey 175). In spite of this, it was hard for them to climb the corporate ladder unless they were privileged enough to be urban citizens or born into a merchant family (Luskey 177). This was especially the case for the rare Black salesclerks, who were considered too good to be mere porters (Luskey 187), although several advertisements did not distinguish between the responsibilities of porters and salesclerks, forcing applicants to clarify that they simply wanted to be productive in their pursuit of social stature (Luskey 195).

In the 1860s, farms were encouraged to join these proprietors of capitalism through the foundation of local granges; in Minnesota, for example, Oliver Kelley, founder of the National Grange, and F.M. McDowell, one of his officers, scoured local neighborhoods for recruits (Blanke 172-173). Additionally, the western United States, wherein most of these farmlands prospered, was slower to indulge in consumerism because small-town general stores ran the market (Klassen 672). Seventy-two percent of retail salesclerks were rural, and most of them were ethnically diverse immigrants (Luskey 176; 186). These communities were more interested in distributing enough supplies and product for their local markets than in shipping out to as many customers as possible (Klassen 673). After the Civil War, farms were absorbed into the mercantile business when those Granges hired salesclerks to help state agents fulfill orders for agricultural tools, dry goods, foodstuffs, and coal, among other commodities (Blanke 178). While Midwestern farmers were generally self-employed (Lebergott 137), they began to rely on suppliers and “communal solidarity” for supplying the increasing demand for products (Blanke 369).

As retailers crystallized in cities, rural citizens held to a “unified consumer ethos,” because they were less inclined to “define their very sense of belonging by the very goods which they consumed” (Blanke 219-220). Even though, in the 1870s, these small, nepotistic western markets shed negotiation, conservatism, and mediocrity in favor of the mechanical mannerisms of their eastern counterparts (Klassen 675), urban advertisers neglected their Granges and newfound interest in consumerism (Blanke 221-222). This held true even though mass retailers found it frugal to dispatch their inventory to small family-owned firms, which lacked the risky minimum efficient scales, labyrinthine means of supervision, and massive facilities of the big players; doing so “reduced unit costs of distribution by increasing the daily flow or throughput within the distribution network” and challenged the necessity of a wholesale middleman (Chandler Jr. [a] 491). Corporations cranked up their advertising prowess as urban dominance pushed small businesses in the most pastoral of rural towns, such as Oshkosh and Ripon, Wisconsin, not too far from Milwaukee, to turn their attention toward the raving city dwellers, tearing the knit community that they had once formed (Blanke 223).

For example, throughout the 1870s, one company, Enoch Morgan’s Sons, taped up window lithographs, handed out trade cards to streetcar drivers, and even composed jingles for its Sapolio soap brand (Laird 180). Bucolic towns such as those received a paltry share of advertising attention in the big city (Blanke 226), leaving their citizens the dreadful task of vouching for their worth to metropolitan employers and recruiters. Perhaps it was advantageous for these community members to keep to themselves, as they ought to have protected their artisanal businesses from corporate buyouts that would integrate them into mass production schemes “of scale and scope” (Rogers 14). An evolving definition of “work” pitted farmers against the elitist philosophy that mental labor was quickly superseding manual labor, for to them “the character of a hard-working man was as reliable an emblem of republican citizenship as hard work with the hands” (Luskey 215).

Meanwhile, farmers could not catch up to their urban counterparts: in 1879, the United States Post Office Department (the predecessor of today’s USPS) introduced discounts for businesses that mailed advertisements, magazines, and newspapers in bulk (Ryan 69). There could not have been a better time for retailers to distribute their advertisements to the masses, fill more job openings, and profit. By the 1880s, weekly newspapers circulated among large American cities carrying more advertising, thanks to the clamor for Civil War bonds (Innis 22-23). Department stores such as A.T. Stewart and Wanamaker’s were wise to have hired advertising consultants to reach female customers and employees (Innis 24). Firms that could not justify hiring specialized agents recruited freelance advertisement writers (Laird 181). They “frequently hired boys to paste or hand out broadsides and handbills and even chromo posters, all over cities, often covering each other’s postings by way of competition” (Laird 84).

In the 1890s, department stores incorporated savings banks into their advertising strategies, claiming that customers could take advantage of competitive interest rates on their deposits and “draw on their funds for purchases at the store, a service that was supposed to especially attract customers who lived far away by allowing easy mail-order purchases with no need to transfer funds” (Osborne 225-226). Plus, the male magnates relied on “neatly dressed, polite women who would sell mechanically and inoffensively” to reel in customers (Benson 6). Department stores could profit on these deposits, as the money was worth roughly 2% more to them than the interest payments they owed on it (Osborne 227). Legislators and scholars saw through their schemes of making customers believe that their money was going to be as safe with the store as it would have been at a proper bank (Osborne 228). Department stores dug their nook in the minds of Americans when they absorbed the banking and advertising industries, as they, along with most other major businesses, began opening their bureaucracies to specialists in marketing and production as well as “efficiency experts” (Laird 213). European countries, on the other hand, were inhospitable to this mercantile beast and reluctant to extend store credit in place of cash, which had long been their standard practice (Lauwaert 27).

By this time, scholars observed the paradigm shift that had just occurred, noting that had anyone applied as a copywriter asking for a salary of between $10,000 and $25,000 even ten or twenty years prior, they would have been shunned (Laird 230). (Today: up to roughly $2 million USD.) In 1895, Lord & Taylor espoused the plea of advertising professionals that companies should hire outside experts in order to save money and uphold the axiom of “every man to his specialty” (Laird 207). Western department stores, such as Schuneman & Evans and T.C. Power & Bro., did start to network with an expansive array of distributors to catch up with the east (Klassen 684-685). For instance, Thomas Charles Power offered “handsome salaries” to managerial candidates and “some ownership in the store-related enterprises” to potential middle managers in order to find quality employees (Klassen 688). The sibling team diversified their product line to gratify families and evidently converted mothers and wives into saleswomen and managers of millinery and dressmaking departments (Klassen 692). Department stores were the most venerable retail employment options for women, since their male supervisors at least tried to empower them by showcasing their charm and expertise, even though nowadays that approach would likely come across as sexist (Gleason 164-165). Incidentally, another industry, direct sales, was attempting to leverage the temerity of its female agents; even though these women were less subservient, to their chagrin they came across as undignified in comparison to salesclerks (Gleason 164-165). This industry also pulled in families, but this time to solicit rather than purchase its products (Gleason 152-153).

In 1896, advertisements challenged sensational news content for the coveted front-page spot following the debut of the Daily Mail (Innis 13). In 1897, thanks to rural free delivery, newspapers finally landed on doorsteps every single morning and reached potential customers and employees far and wide (Innis 25). This was prime time for help-wanted advertisements, as businesses could now reach suburban citizens as well as the urban population (Ryan 10). By the turn of the 20th century, advertising agents were in demand and thus had to start recruiting of their own (Laird 310). The progressive school of thought maintained that recruits needed a formal education in the business of marketing to catch up with the industries of medicine and law (Laird 320). This obscured the viability of firms hiring copywriters rather than relying on advertising agencies to do the dirty work for them.

One philosophy in the 1880s held that firms should bestow offices upon dedicated advertising experts who would slip into their publications articles glorifying the ubiquity of marketing in the lives of consumers and publishers alike (Laird 336). For example, in 1880, department store founder John Wanamaker had been the first to hire a full-time copywriter specifically for promotional and marketing content aimed at loyal customers (Laird 173). Wanamaker’s outstripped the competition with mailed promotional flyers, newspaper spots, and targeted advertisements penned by private copywriters in its arsenal of marketing tactics. Meanwhile, many journalistic firms were less prone to hire specialized copywriters for their campaigns because, while it was indeed inexpensive to double the duties of a current employee, it was even cheaper to turn to a legitimate advertising agency in order to minimize overhead costs and maximize flexibility (Laird 230-231). After all, trade associations overseeing these corporations “codified a new managerial wisdom which framed the problems, possibilities, and goals for each industry, shaping both the consciousness and the actions of its executives” (Benson 2). While retailers circulated copy, direct sales firms recruited straight from colleges and universities in order to burnish their clean image; these students aspired to grow into successful sales agents and imbued the craft with a passion unlike that of “grown men peddling inexpensive novelties” (Gleason 185). Another advantage over urban retailers was their rural demographic of university students looking to enter lucrative careers to help their families thrive and bolster their professional status (Gleason 142). High school students also entered this field to preview adulthood and contribute to their families (Gleason 154).

By the 1890s, direct sales was an increasingly competitive field (Gleason 39), and retail was a monolith of its own breed. These differing business models were similar in that, built atop hierarchical corporate structures, both prioritized hiring women. The Keystone View Company in particular gained enough renown by 1900 to develop an international framework of in-house photographers, manufacturing and printing managers, distributors, suppliers, and of course agents, securing its relevance in the vertical integration game (Gleason 52). However, the low pay and long hours of retail positions, not to mention the paradox of behaving “in ways that were grounded in their own cultural background, but that offended their employers and customers alike” in a sphere mostly populated by women anyway, “drove middle-class working women into other employment” (Benson 7-8). How were women to recover?

Now enters a second version of the modern job application form. More thorough than cover letters for skimming the applicant pool, weighted job application blanks, or biographical data (biodata) questionnaires, rank each criterion on a scale, where the highest score proves most instrumental toward reflecting an applicant’s perceived compatibility (Farmer 1). The Chicago Underwriters first tested weighted job application blanks in 1894 for the selection and screening of life insurance agents based on “a series of standard questions assessing key elements of an individual’s life experience,” including address, previous work experience, and marital status (Farmer 1). This document would become a weapon in an ongoing battle for vocational supremacy waged by retailers, farms, direct sales agencies, and other incipient fields. The Keystone View Company, Underwood & Underwood, or L.R. Bailey could have benefitted from this blank when retailers such as Sears Roebuck listed their signature stereographs in their catalogues as early as 1897 (Gleason 54). By 1899, Sears created the “stereopticons” department to steal more business from the sales agents (Gleason 54). To account for this diversified customer base, department store managers, mustering haughty supervision techniques, increasingly sought employees skilled in social interaction who would encourage regular customers and call them “guests,” treating them as a wife was expected to treat domestic visitors (Benson 3-4). In an equally patriarchal manner, department store culture fabricated the “shopgirl” archetype, which acted as a scapegoat for the inability of management to get customers to splurge, as apparently women were not skilled enough to upsell products the way men instinctually could (Benson 5).

Perhaps the retail environment subdued these skills, as women working as direct sales agents ironically completed a much greater number of sales than did men (Gleason 169). Moreover, Keystone employed agents from Bulgaria, Canada, China, France, Germany, India, Italy, Norway, Russia, and Wales in order to geographically dominate the retail industry (Gleason 76-77). Domestic retailers expected their female salesclerks to possess the social skills necessary to compassionately upsell to customers and emphasize their local rapport, but this backfired, as these same women inadvertently learned how to retort to their male bosses (Benson 22). Women could have benefitted from the weighted application blank’s ability to outline their other skills to various employers outside of retail and sales, but at least they were well past the days of haphazard hiring practices and “de facto apprenticeship system[s]” which polarized new employees (Benson 23). These practices may partially have been instituted in response to the classist identity crisis faced by salesclerks back in the 1850s, when they were ostracized from the middle class and began to act out; they very reluctantly assisted Black porters with their menial labor and initiated unseemly relationships with female customers in order to rebuild their fragile masculinity (Luskey 197). Since department stores wanted to avoid repelling customers, owners went all in on courteousness and providing service with a smile. Astoundingly, Keystone advertised only its new products in local newspapers, never itself as an entity the way its competitor Underwood & Underwood did, and did not venture into the world of mail-order catalogues as Sears did; this dulled its edge, yet the company still managed to make its mark (Gleason 206).

As much as women struggled to find their voices yet could justify diverging from a cultural norm for the sake of their careers, their own children lacked any opportunity to speak out about their labor conditions. Children were still worked to the bone at factories until staggered compulsory school attendance laws genuinely quelled the severity of child labor, at least in the United States, by the early 1900s (Lauwaert 23). Considering the protraction of consumeristic advancements and retail conquest, it was apt for students to stop working and learn how to get a job the right way. Frank R. Moore, an educator and author, published an article in the Journal of Education in 1902 about the ideal English class for elementary and middle school students, in which students would learn how to compose proper sentences, paragraphs, and casual letters (Moore 203).

Moore was most concerned, though, that teachers were neglecting to instruct students on how to compose formal correspondence, including, of course, letters of application, on the proper leaf of paper, using the correct folding techniques. Moore’s criticism reflected academic disagreements on how to send formal correspondence; furthermore, teachers and students alike dreaded the practice of composition. Students sat through mundane lectures and rules that lacked humanistic expression. Moore advocated for students to acquire “a broad, rich, useful vocabulary that will have a close relation to the needs of the child in his language expression and lead to a discriminating use of words,” marrying disparate thoughts for real-world operation (Moore 203). To Moore’s delight, this was around the time when the University of Illinois and New York University would pitch business education in the United States (Edwards 66). His idyllic fifth graders, versed in informal letters, letters of application, and letters of recommendation, would find themselves in good hands by the time they reached their twenties (Moore 206).

Seminal application blanks featured nearly identical items that starkly captured their purpose (Bloomfield 126). In 1903, the New York State Nurses’ Association demystified its new nursing application process: applicants verified ownership of their written answers and of an undoctored photograph, and references personally testified to their experience, rendering letters of recommendation redundant (Nursing 1903 218). A year later in New York, nurses were still acclimating to the process, submitting “careless” blanks to the board that notably lacked sufficient evidence of their references being real people (Hitchcock 174). The board informed nurses that the superintendent had to sign off on the application blank in lieu of attaching a diploma, as “unscrupulous or a low standard of integrity can quickly overturn what years of education and intelligence cannot replace” (Hitchcock 174). Adding insult to injury, they called in notaries to impede “unworthy” candidates from earning a nursing registration (Hitchcock 175).

Meanwhile, in Colorado, Maud McClaskie described a weighted regimen for screening nurses so that the general public could be safe from “unscrupulous pretenders” (McClaskie 843). The Colorado board only registered graduates of a certified hospital or medical training program who were at least twenty-three years of age and had submitted an undoctored portrait, a payment of $10 (today: $320, plus tax), and, most vitally, an application form with their name, address, educational background (plus notarized proof of a diploma), and, of course, nursing background; board members kept personal information anonymous until all of the applicants were ranked (McClaskie 844). Specifically, the Colorado board favored “civil service” standards for weighing examination scores and educational and nursing backgrounds on a scale from one to ten, with 6.6 and above designated as the passing grade; in spite of this, nurses waiting for their final scores could provisionally practice professionally (McClaskie 844). Application blanks, only some decades old, promptly won the hearts of hawk-eyed managers and supervisors who wished glory unto their applicants.
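To make the mechanics concrete, here is a minimal sketch of how such a weighted blank reduces an applicant to a single pass-or-fail number. Only the one-to-ten scale and the 6.6 cutoff come from McClaskie’s account; the criteria, weights, and sample ratings below are hypothetical illustrations, not the board’s actual instrument.

```python
# A minimal sketch of weighted application-blank scoring in the spirit of the
# Colorado board's "civil service" standards. The criteria, weights, and
# applicant ratings are hypothetical; only the 1-to-10 scale and the 6.6
# passing grade come from the McClaskie account.

# Hypothetical weights: each criterion's share of the final score (sums to 1.0).
WEIGHTS = {
    "examination_score": 0.5,
    "educational_background": 0.25,
    "nursing_background": 0.25,
}

PASSING_GRADE = 6.6  # 6.6 and above passes (McClaskie 844)

def weighted_score(ratings: dict) -> float:
    """Combine per-criterion ratings (each on a 1-to-10 scale) into one score."""
    return sum(WEIGHTS[criterion] * rating for criterion, rating in ratings.items())

# Board members ranked applicants anonymously; here the applicant is just a dict.
applicant = {
    "examination_score": 7.0,
    "educational_background": 8.0,
    "nursing_background": 5.0,
}

score = weighted_score(applicant)
print(f"weighted score: {score:.2f} -> {'pass' if score >= PASSING_GRADE else 'fail'}")
# weighted score: 6.75 -> pass
```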

I presume that Moore, raised in the midst of the second Industrial Revolution, believed that early indoctrination into the workforce would comfortably relegate the unemployed masses into a shifting labor supply and keep the United States running. He wanted to prepare young minds for the reality that promotion rather than innovation illustrated the average American success story, and that their fate could extend beyond menial labor. These students would find themselves as young adults living through the Great Depression, testing their correspondence chops. They also got to jump off swings and ride down slides when the country promoted physical fitness and a proper education in response to the call for “educated laborers” at the turn of the century (Lauwaert 35-36). Fittingly, some American department stores “sported rooftop gardens and playgrounds” to further stimulate this younger crowd (Lauwaert 27). Their toys invited boys into the worlds of science and manufacturing, whereas girls saw department stores as a shopping tutorial (Lauwaert 39). Department stores also charmed the upper class by introducing welfare work by 1910, including medical programs that would certainly “attract the elusive ‘better class’ of salesperson to a store that was a more attractive and respectable place to work” (Benson 14).

Now, we travel to Boston in 1911, when the Vocational Bureau of Boston assembled a coalition called the Employment Managers’ Association in order to gather experts to deal with the novel problem of hiring, monitoring, and promoting qualified employees, as well as to discuss the foundations of a dedicated employment department (Bloomfield 121). Despite countless experiments, there was no solution (Bloomfield 122). Because of this conundrum, they predicted, correctly, that a nationwide trend of forming similar coalitions would be “fundamental, and in accord with the aims of both industry and social service” (Bloomfield 121). Six years later, the United States Department of Labor would schedule a conference discussing this issue in the more specific context of employee turnover rates, and would also conduct experiments concerning the cost of employee turnover. More corporations were delegating the responsibility of hiring to specialized managers rather than department heads; Bloomfield was so passionate about the gravity of this matter that he conjectured that “not everybody can or should hire; not everybody can supervise men” (Bloomfield 123). It was still debatable whether there could ever be a standardized process for qualifying candidates (Bloomfield 124). One impetus of this issue was the labor reserve resulting from the influx of factory and management openings that appeared during the peak of this second Industrial Revolution (Bloomfield 125). Their best bet was to recruit college students into probationary internship programs during weekends and breaks and document their aptitude for the positions in question, yet the coalition still had no grasp of how those skills actually coincided with the job requirements (Bloomfield 125-126).

Clearly, a solution had to be found; since the domain of recruitment had reached such a precipice of urgency in the wake of the corporate explosion, the pursuit of the perfect employee ignited to wildfire proportions (Stuit 28-29). A slew of specific vocational assessments for the fields of dentistry, pharmaceuticals, law, engineering, veterinary medicine, and more introduced high schoolers to the real world during this time as well (Stuit 33). The United States was truly starting to realize that its children were the future generation of business owners, farmers, and teachers, so scholars and politicians started to pay attention to their treatment. “Errand boys” were males under the age of sixteen who worked whenever they were not in class; 1916’s new child labor law required minors to attend school for at least “eight hours a week.”

In relation to Moore’s propositions of business education for children, Huey commented that these poised adolescents created a new labor market ripe for exploitation. Child labor was starting to let up, but that did not mean that there was no more labor abuse among adults; two decades later, in 1938, the Fair Labor Standards Act would attempt to abrogate abusive labor practices in the United States. For the moment, however, Huey recommended that managers organize their records of letters of application to prepare for interviews (Huey 210). On that note, applicants were to write at least their ages, educational backgrounds, and previous employment experiences in their letters of application; handwriting even served as a factor in the “mental qualifications of the candidate” (Huey 212). Managers were then expected to stow rejected applications away in a separate folder until the applicant in question became more fit for the position (Huey 215).

The aspects of handwriting and educational background were particularly important to scholars in the 1910s. Charles Rounds, the head of the English department at Milwaukee State Normal School as of 1915, was less than pleased with the condition of English education in his day. Teachers were not grasping the fact that their students would one day teach students of their own, so they needed to be “efficient, purposeful, useful” individuals with the skills of self-criticism and productivity (Rounds 68). Worse, students were conflating “its” and “it’s” and did not know how to ask for information or compose “letters of application, letters of complaint, or letters giving information to members of their school board” (Rounds 69).

It seems that writing was a real enigma in the early twentieth century; it is a wonder that these students made it into adulthood, and that their teachers were even qualified enough to teach them at all. Tied in with the ability to apply for jobs is the resilience to deal with rejection and failure; waxing philosophical, Rounds wisely added that we feel better when we feel that we are overcoming daily obstacles and even remotely making progress (Rounds 69). Managers were more selective than ever before, and young adults needed to prepare for the truth that their skills alone might not get them anywhere. The right employee always meant “in reality a potential vacancy to the employment manager” (Huey 208), so managers were always on the lookout for replacements.

That being said, few professors knew how to articulate this concept to the young adults ready to hear it; pioneering formal writing courses prioritized rhetoric and mechanics rather than communication in and of itself (Popken 97). While Sherwin Cody had published the book How to Deal with Human Nature in Business in 1904, Cody had only separately discussed the matters of sending letters to firms using two-cent postage (Cody 351) and of how sales managers recruited customers and potential salesmen with connections and a handy phonebook (Cody 414-415), without piecing these two elements together into one solid description of the letter of application. To his credit, Cody did address the employer perspective, which was doubtless an anomalous feat (Edwards 63); yet Cody was an amateur, not an expert, on the subject of business correspondence, being a prolific enthusiast of astrology, home construction, and hypnotism, among other topics (Popken 99).


World Wars:

Between 1910 and 1940, the number of business schools in the United States increased tenfold, with professors, like medieval masters, lecturing about the templates and properties of business correspondence in a simulated business environment rather than contextualizing it in actual practice (Popken 96). Early innovators in this domain included George Hotchkiss and Edwin Hall Gardner, the latter of whom would publish in 1915 a book called Effective Business Letters, which was “the first college-level text that focused solely on business writing” (Edwards 66).

Gardner was one of the earliest authors to instruct readers on how to write a letter of application. He recommended that applicants write in accordance with the specific needs of the employer, avoiding what he called the “announcement style,” which would bluntly state that the applicant is interested in the position and wants to start on a given date (Gardner 307). Enumerated qualifications relevant to the employer’s needs introduced the paragraphs of Gardner’s ideal letter of application to ensure an outstanding first impression on the recruiter or manager, peppered with refined expressions of “reliability, experience, and originality” (Gardner 308-309). He proceeded to outline the sections as follows: how the applicant had learned about the position and its requirements; their prior employment experience; personal qualities that make the applicant stand out from the applicant pool; and “the clincher,” or simply the applicant’s phone number or desired interview setting (Gardner 310-312).

In a modern context, these application letters would equate to cover letters, not job application forms or resumes, especially because resumes and job applications are now submitted as separate documents to hiring managers. However, in the 1910s and 1920s, the prevailing academic persuasion held that resumes were subcategories of employment applications (Popken 100). The professional workforce anticipated a merger of typical resume information with that of the letter of application into one streamlined document. The separation of resume and letter of application would instead materialize in the mid-1920s; professors and authors would claim this autonomy in the 1930s (Popken 103-104). A subsequent revision of the letter of application called for bulleted phrases about the applicant’s background, as for the recruiter, less meant more (Popken 102).

Why, many wonder, must we cut the information from our resumes and paste it all into our applications? Can managers not gain enough information from a resume alone? Legally, they cannot: job applications are legal documents, while resumes are not, being classified simply as a “self-marketing document” (Kennedy F2). Submitted application blanks enter the applicant’s permanent file that has been following them since middle school; businesses can either write their own job application forms and questions with the supervision of their legal team “to avoid violating civil rights statutes at federal and state levels” or order them from vendors (Inc.). Applicants must certify the veracity of the information that they write on their application blank, since false claims can get them dismissed from consideration or later fired (Kennedy F2). On the other hand, resumes “have no statement of fact under penalty of perjury” and do not authorize background checks (Simpson). Furthermore, job application blanks differ from employment contracts, noncompete agreements, and independent contractor agreements alike (Varga 64).

Since applications are born from legal jargon, their content is “highly confidential” and never shared with third parties (Varga 75-76). One infamous case of job application fraud involved Julie Swetnick, one of the women who accused Supreme Court nominee Brett Kavanaugh of sexual assault; her past holds a lawsuit in which a former employer accused her of stretching her tenure with a previous employer and touting “an undergraduate degree in biology and chemistry from Johns Hopkins University” (Kunzelman et al A4). Remember that job applications alone do not warrant background checks; she could have gotten away with it had she not been ensnared in Kavanaugh’s life. This case amplified the indispensability of job application forms in a just American society. However, the separation of resume and job application can discourage thousands of applicants from finishing the online process unless job boards ask for “minimal information” and call for more as needed (Maurer).

What standard must these employers follow for their own application forms? There is no singular standard (Varga 77), yet economist Royal Meeker shouldered the standardization of the job application form in 1917 (Meeker 116) during his mission to resolve an epidemic of miscalculated employment and turnover rates across the United States and in the military (Meeker 16). Meeker found that factories were not properly tracking their employees, even with the assistance of new hiring managers (Meeker 5). His bookkeeping measure would strive to increase productivity by at least 60%, leave time for longer breaks, increase the remunerability of recruitment, and stimulate employee morale, which had long gone unconsidered (Meeker 5). He proposed a turnover record system which would graphically account for “the parallelism between high turnover and low efficiency” of each employee by recording his “earnings and bonuses, defective work, absences and tardiness, his complaints and those charged against him, a periodic certification by foremen, and, when he leaves, his apparent or declared reasons for going” (Meeker 35). While he agreed that it was good for factories to have a backup plan for employees who fell ill, were fired, quit, or died, he thought it impractical for them to hire beyond capacity (Meeker 16). Hiring managers, in his opinion, were on a level with production managers, and needed to possess the character conducive to recruiting and shaping the right “human material” into their respective positions (Meeker 23).
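The arithmetic underlying such a record system is worth seeing once. Below is a minimal sketch, in Python, of the standard turnover-rate calculation (separations divided by average headcount); the department names, figures, and flag threshold are hypothetical illustrations, not Meeker’s own ledger columns, which are quoted above.

```python
# A minimal sketch of the kind of bookkeeping Meeker argued for: tracking each
# department's separations against its average headcount so that "the
# parallelism between high turnover and low efficiency" shows up on paper.
# All departments, figures, and the flag threshold here are hypothetical.

def turnover_rate(separations: int, headcount_start: int, headcount_end: int) -> float:
    """Annual separations divided by average headcount, as a fraction."""
    average_headcount = (headcount_start + headcount_end) / 2
    return separations / average_headcount

departments = {
    # name: (separations, headcount at start of year, headcount at end of year)
    "polishing": (45, 60, 50),
    "tempering": (8, 40, 42),
}

FLAG_THRESHOLD = 0.5  # hypothetical: flag any department losing half its staff

for name, (seps, start, end) in departments.items():
    rate = turnover_rate(seps, start, end)
    flag = "  <- investigate" if rate >= FLAG_THRESHOLD else ""
    print(f"{name}: {rate:.0%} annual turnover{flag}")
# polishing: 82% annual turnover  <- investigate
# tempering: 20% annual turnover
```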

Assessments conducted after the application was submitted were as valuable to Meeker as interviews, bookkeeping, payroll, and marketing, even though their cost depended on the “experience and skill” of the new hire (Meeker 20). (Nowadays, information gathered from psychological assessments gauges an applicant’s personal identity via the dispositions that they reveal when answering generalized weighted questions pertaining to the “big five model” of personality (Farmer 8).) Meeker himself did not condone psychological tests, favoring instead assessments of how each recruit would assimilate into each department (Meeker 184). He approved of home inspections, physicals, and background checks as well (Meeker 36-37).

In the proceedings of his conference, Meeker buries a treasure for us to marvel at: the first modern job application form in all of its glory. Of course, the government always has the new technology before the masses get to hear about it. This document (Meeker 181) explores the recruitment journey of John Sobritski, a 33-year-old man residing at 4623 Milnor Street in Philadelphia, Pennsylvania, only 2.7 miles from his prior workplace of six months, Henry Disston & Sons (currently standing as Disston Precision Incorporated) in Tacony, which he had quit because he moved. He applied for an unspecified position on February 24th, 1917. Sobritski neither spoke nor wrote English on account of his Polish heritage. Outside of Disston’s factory, Sobritski had worked for one year at a spring shop for scant compensation, six months at a dissatisfying manufacturer, two months at Germantown Tool Company, which was too far of a commute, and two years at Fayette R. Plumb before he relocated to Germantown. He expected “piece work” at a salary above the $11.50 to $20.00 he had earned from his past experience in polishing. At the company in Meeker’s example, Sobritski was a “good polisher on edge tools,” but he smoked in the grinding room and got himself fired.

This application form laid bare potentially an entire lifetime of work in an organized fashion that discouraged feeble “floaters” who did not want to work on factory yard tasks such as “trucking, grinding, tempering, polishing, and heating” (Meeker 184). It also freed supervisory staff of the responsibility of handling much of the screening process, although it hurt their pride (Meeker 179). In Part Two of this exploration of the history of the job application form, I will discuss why most of these questions would definitely not slide on a modern job application form. In short, Meeker’s screening process reeked of discrimination, as he scouted for laborers of certain ethnicities, ages, heights, genders, et cetera. Ironically, this process was meant to eliminate “cliques” of people sharing nationalities and creeds (Meeker 180). This practice does coincide, though, with a myopic requirement that had been proposed by the Civil Service Commission (CSC) to attach photographs to job applications; the requirement was intended to diversify labor opportunities for Black workers but ultimately emboldened racist employers to discriminate against them simply for their skin color (MacLaury 6-7).

Prior to this form, the supervisory team had signed a requisition slip confirming open positions; after Sobritski submitted the application, the company maintained his record card and delineated his wages, employment statuses, and demographic information to weed out the “loafers” (Meeker 181-182). The employment department would interview employees once a month to check on their wellbeing within the company (Meeker 85-86). They stowed dismissed applications in a separate file for later, when they would need someone immediately (Meeker 99), as per Huey’s foregoing suggestion. This accordingly reduced outside hiring (Meeker 121), which reset Sobritski’s circular quest for employment at a different manufacturing establishment.

Despite Meeker’s dismissal of employment assessments, they would still start small, in law classrooms at universities, in the form of aptitude assessments. The Dearborn Group Test, first distributed in 1920, was a problem-solving assessment for which subjects had to arrange a set of wooden blocks quickly and efficiently to solve a logic puzzle (Sells 40). 1923 saw the launch of the Inglis English Vocabulary Test, an assessment mainly for high school students, who had to study 150 fairly common but not everyday vocabulary words (Sells 40). By 1925, studies determined that clerical employees were less likely to quit their jobs if their fathers worked in unskilled labor (Hom et al 530); ironically, some believed that the sheer monotony of typing, filing, or assembling contributed to turnover rates (Hom et al 532). Regardless, Meeker would have been wise to follow this area of study, which discussed ways to mitigate turnover such as offering pay raises to imminent quitters, calculating ways to reduce company costs, or lending expertise to researchers of the subject (Hom et al 530).

Relatedly, the Henmon-Nelson assessment used the groundbreaking “Clapp-Young self-marking device” (Clapp 305), which “require[d] no separate scoring key” because carbon encased inside a perforated sheet captured the responses (Reeds 5). Saturated as it was with ninety questions pertaining to “vocabulary, sentence completion, word classification, logical selection, disarranged sentences, interpretation of proverbs, verbal analogies, mixed spelling, series completion, design analogies, and arithmetic reasoning,” this assessment was designed for measuring academic achievement rather than vocational capability (Reeds 6). However, employers, such as those screening clerical applicants for municipal positions in Decatur, Illinois in 1925, used the Henmon-Nelson assessment in conjunction with unassociated tests measuring typing, handwriting, and orthographic abilities (Reeds 21). The test would undergo a revision in 1950 owing to its perfunctory manual (Reeds 5). Soon, the Wonderlic Personnel Test, a notorious intelligence assessment with an easy strip-key scoring system, would braid educational aptitude and dubious psychological research into a vocational assessment that came in five varieties (Reeds 9-10). The best questions were founded on multiple-choice and true-or-false formats because each had only one possible response to weigh by preference (Farmer 16).

Employers started distributing employment assessments because hiring managers could no longer commingle with a burgeoning urban population as in the good old days, and it had become more expensive to train employees; they could no longer hire pretty faces with merely adequate experience (Reeds 1). The exception was department store managers, who recruited tantalizing male salesclerks to entice a female demographic of mothers and wives— and their children (Luskey 201). As disturbing as this practice was, it did help the salesclerks galvanize their masculine pride and stimulate social skills training (Luskey 202-203). I will postulate that it reduced turnover rates as well. Keep in mind that this all started with letting women participate in public debates, and it escalated into a theatrical dreamscape for womankind. Some firms utilized the Kuder Preference Test; the Statler Hotel in Boston, Massachusetts, for example, paired it with Wonderlic tests A and D and the O’Rourke Clerical Aptitude Test (Prowell 31). The hotel only administered these assessments once the employee in question had been working consistently for them for a span of three months, or after a “second interview with the personnel director” (Prowell 90). Prowell suggested that short mental aptitude tests exposed the profitability of further training because “an employee will soon lose interest in his job if the tasks are either above or below his mental ability” (Prowell 115).

Within a decade of that momentous occasion, managers throughout the United States began to acquaint themselves with the job application blank. Hot off the presses in 1918 came a request from the Tuscaloosa County Self-Preservation Loyalty League for all registered civilians to participate in “war work” and submit a job application to their local employment agency, mainly for unskilled labor (Tuscaloosa News 4). The shipping board in Austin, Texas, meanwhile, never received its 1918 shipment of application blanks to be distributed to shipbuilders (Sevier et al 8). In 1919, Ralph G. Brewster, secretary of the Portland, Maine board of accountancy, was accepting applications for the certified public accountant examination (Brewster 20). That same year, the Allentown, Pennsylvania State Hospital advertised applications for becoming a registered school nurse (Tyler 13). In 1923 in Mount Vernon, New York, Benjamin Cullen dispersed application forms for aspiring nurses and firefighter chiefs (Cullen 12). Last but not least for this closer look at the past, in 1927, one full decade after Meeker’s propositions for a standard job application form, the Indianapolis Railway Institute sought aspiring firemen, brakemen, and sleeping car porters (Susong 18). In fact, the workforce became so attached to these documents that factories were hiring hundreds of thousands of laborers into a sphere of corporate advancement, “narrowly defined job classifications, firm-specific training, and implicit and/or explicit seniority arrangements,” all under the pretense of changing the contingencies of their employment on a whim (Rogers 13-14). The typical applicant faced questions pertaining to these restrictions only after establishing a good first impression with the employer (Fletcher 20); these days, it is more of a cold open than a warm handshake.

I will cautiously presume that during the early 1900s white women were the ones carrying the most change, in the form of currency spent at department stores and suffragism in the war for voting rights. This status confronts the whopping 12,333 Black women in Baltimore (50.45%) who worked as servants, the 7,716 (31.56%) who worked as laundresses, and the 746 (3.05%) who worked as waitresses (Foner 24-28) leading into 1923. During the second Industrial Revolution, women did take the stage as beacons of corporate progress, so these Baltimore data may have belabored the peonage of womankind; the report from Chicago, by contrast, highlighted the steps being taken to be more inclusive of Black women, who were “slowly but surely beating down color prejudice and taking her place as a factor in industry,” in Chicago at least (Foner 22). The data exalted the vindication of Black women in Chicago: one of the numerous department stores was looking to hire 75 Black women as merchandise inspectors, and a spring cushion factory had hired 350 women since 1919, one of them working as a welfare secretary, and planned to hire 250 more in the coming years (Foner 22). Factories in Atlanta typically hired women within their family unit while paying them their individual wages (Frankel 31). It was therefore fitting for retailers to drop the “shopgirl” trope and transition into a total “resocialization” of their female salesclerks, whose new aim was to upsell as if they were perpetually conversing with other women or their domestic clients, suggesting to one another matching accessories, stellar deals, the proper dress size, and suchlike (Benson 16). Women were objects of the retailer’s delight, ripe as they were for producing profit.

Black men, on the other hand, typically worked in factories in the northern states, as well as in mills and stockyards (Foner 15). Overall, according to a 1924 Baltimore survey distributed to the employers of more than 39,210 Black men, 4,879 (12.44%) worked as unskilled building laborers, 3,100 (7.9%) as stevedores, 1,876 (4.78%) as servants, 1,831 (4.67%) as chauffeurs, and 1,712 (4.37%) as draymen (Foner 24-28). In the steam railroad industry, 136,065 Black men worked in unskilled labor positions (Foner 36). Competition for unskilled positions during these fairly prosperous years, fueled by the broadened access to application blanks, would ultimately lower labor costs and hand firms an excuse to breach the Fair Labor Standards Act of 1938 (Rogers 19). Concurrently, Black men fronted the fertilizer, waterfront manual labor, and construction industries (Foner 30-31), as long as white supremacists did not oust them in favor of anti-union white men, as in Atlanta, Georgia in the early 1900s (Frankel 11).

Black men therefore did not find the same success as Black women, or all women for that matter; their best bet for employment was, ironically, to fill in for these picketing white men (Foner 20-21). Up to 4% of Atlanta’s African American population, though, was classified as members of the “professional and business class” who did not have to deal with fickle white employers who hired Black men strictly for the terms of their contract; this marked another win for Black women, who rose up as the breadwinners for these men (Frankel 11). Factory managers were the most likely to recommend Black employees for their “satisfactory results” (Foner 35). On the cusp of the 1930s, Black retail workers would outrank Black direct sales agents, with one Chicago department store boasting a 60% Black workforce (Foner 61). Of all fields, it was industrial chemistry that experienced an influx of Black workers (Foner 69). Unfortunately, Black women still dominated “agriculture, domestic and personal service, dressmaking, tobacco factories, and teaching” rather than more “gainful” occupations such as railroad construction or work at a turpentine mill (Foner 62). This was during the experimental era of personal selling, through which trained salesclerks would “increase both the size and number of sales transactions” by “appeal[ing] to the customers’ vanities of class and sex” (Benson 3). In Detroit, Black men filled the automobile factories (Foner 59).

From the 1920s until the start of World War II in 1939, scholars and employers further explored the concept of the weighted job application blank in order to predict and eliminate incompatible applicants. By 1935, civilian researchers had determined up to twenty scientifically sound methods of surveying candidates (Farmer 1). Military researchers went the extra mile, deducing that multiple-choice assessments on scored biographical data forms predicted which applicants would quit or make it through training more successfully than the top ten psychological and physical assessments available at the time. Long before modern employment assessments interwove with application information, several studies showed that assessments adapted to specific positions most successfully predicted employability (Sells 46-47). Combinations of assessments, such as either the Stanford-Binet intelligence test or Kornhauser’s Test of Personality Characteristics plus analyses of applicant experiences, proved to be more effective than interviews alone (Sells 47). Managers who utilize multiple selection devices, such as interviews, assessments, and weighted job application blanks, are better informed than those who depend on impressions or random selection. While this conclusion may seem obvious at first, the reasoning is a bit more complicated.

The process of formalizing this conclusion, known as selection utility, was first proposed in 1939 and has been conceptualized into two different models. The BCG Model of selection utility, named after H.E. Brogden, L.J. Cronbach, and G.C. Gleser respectively, solved the problem of managers using difficult, haphazard models for assessing and selecting applicants. Even with the option of application blanks, hiring managers could not reliably measure how well new employees would perform based on factors of their service and their profitability. Starting in the 1930s, high schoolers were subject to generalized assessments as well to predict their vocational abilities; tests included the Differential Aptitude Tests, the Guilford-Zimmerman Aptitude Survey, the USES General Aptitude Battery, the Roeder General Aptitude Profile, and the Wechsler-Bellevue Intelligence Scale (Stuit 28-29). Ostensibly, the powers that be wanted to pin down where and how citizens were going to contribute to society once they graduated from their newly required schooling, so as to nurture the logistical flow of the American workforce during the Great Depression. Graduation from at least high school was a determining factor of employability in several industries (Malm 241).
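For reference, the BCG estimate of the dollar payoff of a selection device is usually written as follows in later personnel psychology textbooks; the notation below is that modern convention, not anything spelled out in the 1939 proposal or in the sources cited here:

\[
\Delta U \;=\; N_s \cdot T \cdot r_{xy} \cdot SD_y \cdot \bar{z}_s \;-\; N_a \cdot C
\]

Here N_s is the number of applicants hired, T their average tenure in years, r_xy the validity of the device (the correlation between its scores and later job performance), SD_y the dollar value of one standard deviation of job performance, z̄_s the average standardized score of those hired, N_a the number of applicants assessed, and C the cost of assessing each one. In plain terms, a more valid device or a more selective cutoff buys better performance, and that gain has to outweigh the cost of screening the whole applicant pool.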

In factories and department stores nationwide, employees suffered. The economic downturn had driven minor immigrant employees of both mills and retailers into homelessness and starvation, raising understandable concern for the wellbeing of children throughout the United States (Opler 14). The Fair Labor Standards Act of 1938 took aim at these subpar, “detrimental” labor conditions (Rogers 9). Union organizers relished the words of the act, which required corporations to “proceed with collective bargaining” (Opler 92) while it established the minimum wage and the forty-hour work week in order to eliminate unfair means of competition, instituted a standard for bookkeeping so that corporations could properly report records, and, most importantly of all, “prohibited most forms of child labor” (Rogers 10). Victims of these callous conditions had the option to sue their bosses and make them regret extorting personality for the sake of productivity (Rogers 10). However, when capitalism warranted a will, contractors in manufacturing plants found a way, letting their subcontractors hire laborers, including minors of course, to avoid liability (Rogers 11). The act lifted the responsibility of manual labor from a vast majority of minors and allowed for the construction of more playgrounds, baseball fields, and public swimming pools (Kilgore 179). It could have lessened incidents of shoplifting among minors as well (Opler 23). It also disrupted the flow of hiring documentation by defining an employee essentially as a person who is permitted to work, in the sense that firms were held “liable for foreseeable and preventable violations by their contractors, even if they declined to exercise control over the physical details of their work” (Rogers 22). In other words, a company could not escape liability for hiring children simply by assigning them a particular task under a subcontractor, nor could it saddle immigrants with a disproportionate amount of labor.

In Texas in 1910, Mexican immigrants had been hired to maintain railroad tracks, tend cattle, and fill the cotton plantation roles once forced upon enslaved Black laborers (Kilgore 53). To close these inhumane loopholes, courts decided to clarify the Act’s definition of employment, concluding that an employer had to allow their employees to perform reasonable tasks under agreeable supervision at logical pay rates, and to properly notify them of any changes to their status (Rogers 23-24). This would redress circumstances such as those in Texas, where textile mills paid white workers less, despite their representing the majority of mill employees in the state, because of competition from mills in neighboring states, although Texas had passed a labor law in 1900 which had tackled unfair labor practices against women and children (Kilgore 54). Unsurprisingly, these mills rarely, if ever, hired Mexican immigrants, relegating them instead to the great outdoors, unlike mills in the northern states, which prioritized immigrant labor (Kilgore 11).

Rampant racism during the Great Depression inspired the Works Progress Administration (WPA), a relief agency staffed strictly by employees selected from “relief rolls to assure that benefits went to the neediest,” to omit discriminatory information such as racial and religious affiliations from its own, presently required personnel diversity records, obscuring its mission (MacLaury 66-68). One Texas town, McKinney, used population as an excuse for prolonging child labor practices as it tried to keep up with the “southern pattern” of mill employment (Kilgore 82). The town must have blissfully ignored the progress made by a previous child labor law, the Keating-Owen Act of 1916, whose agents tracked down child labor culprits via employment records; this was not too different a circumstance (Frankel 120). The agents would strain to verify the ages of the children working at factories when distributing employment certificates because birth certificates were very rare (Frankel 120-121). The demand for textiles leapt in Texas, calling for an additional 250 adult employees by 1933, pulled from the fields to the conveyor belts (Kilgore 160-161). By 1938, the McKinney mill would surge in employees after a lighting inspection forced it to double down on lights, increase wages, and hire 200 more workers (Kilgore 178-179). Finally, after the 1938 act passed into law, hours and wages increased mainly for white men, as Black men were perceived as less assiduous than white men (Kilgore 180).

Government agencies, like the Department of Labor, the Department of Agriculture, and the Department of the Treasury, all retained skewed diversity records (MacLaury 122). In spite of this, the CSC had rescinded in 1933 both its injudicious photograph requirement for job applications and its discretion to interview three candidates at once and hire or reject them without any explanation, resulting in a twofold increase in Black government employees and an astounding threefold increase in Black employees nationwide (MacLaury 59). The fight against discrimination continued in 1943, when President Franklin Roosevelt formed a committee to screen complaints concerning any discernible discrimination resulting from the interpretation of job application information (MacLaury 96-97). Under the Fair Employment Practice Committee (FEPC), Roosevelt allowed his examiners to settle discrimination suits by having the employer delete some or all racial requirements from the application form (MacLaury 97).


The White-Collar Boom:

Perhaps gender was another discriminatory factor: by the end of World War II, the majority of the women who had commingled with white men in factories were relegated to white-collar positions (Keogh 79), while others were coerced into laundry service and chauffeur work via deceptive advertising techniques (Keogh 97). By 1945, 16% of the workforce was not white (Keogh 72). Speaking of pallor, the white-collar boom of the 1950s sequestered hundreds of thousands of employees into offices and professional work spaces. The office environment acclimated further to the paper blank that would continue to alter the landscape of recruitment.

The descriptor “white-collar” originated in the 1850s, when the division between employer and employee began to dissipate; managers found themselves doing as much work as, if not more than, their subordinates while accessorizing with a white collar “to complement the respectable, dark, broadcloth coat emblematic of bourgeois responsibility” (Luskey 173). Astonishingly, one of the first byproducts of this movement was remote computer programming. Electronic computers have been around since at least the 1950s, thanks once again to the military, who always seem to get the good stuff first (Homberg et al 77). The first of these remote programmers worked from home in the 1950s, writing their formulas and code into letters which they would mail to the data center, where someone would punch the data into cards for the machine (Homberg et al 78). I wonder if these pioneering remote employees were able to grandfather themselves out of President Harry Truman’s 1951 committee, the President’s Committee on Government Contract Compliance (PCGCC), which enforced anti-discrimination policies in American workplaces; inspectors alerted employers to diversity problems and expected them to cooperate, and a failure to do so triggered “possible sanctions or penalties” pending a deeper investigation into that employer’s labor practices (MacLaury 126-127).

In the 1950s, the new trend was qualification based on mental stability. Psychological assessments and weighted application blanks were all the rage (Malm 231). Retail employees were considered blue-collar workers given their public workspace. However, their managers were white-collar because they sat at their desks thumbing through job application forms, contacting references, and performing background checks— except when they were not doing so, as with some department stores in Massachusetts and Indiana, according to a hands-on study of the application and hiring processes of various retailers conducted by Leon Gregory Nagler (Nagler 10). The diversity among these application blanks was due to a fluctuating economy and “the advance of personnel research;” companies referred to a study conducted by the American Management Association in 1949 which identified eleven checkpoints for an effective application blank (Nagler 63). In summary, the questionnaire considered whether certain items were relevant to the position, necessary for identifying unqualified applicants, and likely to provide reliable results (Nagler 63-64). Many questions would appear conversational to the applicant; others were planted either to satisfy the “pet idea” or hunch of an employer or to test the reaction of applicants to “foolish,” offhand items and gauge whether the office environment would suit them (Fletcher 20). Applicants ought not to have skipped these items out of fear of answering them incorrectly, because all that mattered were the facts being requested of them, such as “whether you are saving money, own your home, carry life insurance, or enjoy happy married life” (Fletcher 20).

By requesting biodata on how a person has behaved in a similar situation in the past, while knowing which factors differ in the current environment for that same situation, a manager can compare the past data to the ideal data, or behavioral and occupational expectations, of the firm and reveal whether the applicant is suitable for the job (Farmer 13). It turns out that these types of unorthodox questions tended to predict criminal activity such as theft and credit card fraud better than the standard questions could, if the standard questions could at all (Beall 19). The survey also asked employers to assess the trustworthiness of the applicant in answering these questions and to ensure compliance with non-discriminatory employment laws (Nagler 63-64). The ability of certain unorthodox items to predict transgressions better than others gave them more weight, or influence, toward defining an applicant’s potential decorum (Beall 19). Credit investigations could uncover embarrassing or uncouth incidents that applicants tried to hide from the blank (Fletcher 20). Information gathered via standardized questionnaires and assessments could act as a quantifiable litmus test for human behavior: because people typically continue to act in the present and future as they were conditioned to behave in the past, one could feasibly predict how they would behave on the job while preventing interviewer bias and painting a fair picture of the applicant (Farmer 7). Similarly, with biodata, people’s futures are generally influenced simultaneously by both their past and present decisions, making it even easier for application data to predict how well an applicant will assimilate to the environment (Farmer 14).
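
To ground this mechanism, below is a minimal sketch in Python of how a hiring office might mechanically compare an applicant’s biodata to a firm’s ideal profile; the field names, ideal values, and matching rule are invented for illustration and are not drawn from Farmer.

    # A minimal sketch of biodata screening: compare an applicant's reported
    # past behavior to the firm's "ideal" profile. All fields are invented.

    IDEAL_PROFILE = {
        "stayed_at_last_job_years": 3,      # longevity as a proxy for loyalty
        "led_a_team_before": True,
        "reported_cash_shortages": False,   # e.g. a hypothetical theft-risk signal
    }

    def matches_ideal(applicant, ideal):
        """Return the fraction of biodata items on which the applicant matches."""
        hits = 0
        for field, ideal_value in ideal.items():
            value = applicant.get(field)
            # Numeric items pass at-or-above the ideal; everything else must match.
            if isinstance(ideal_value, (int, float)) and not isinstance(ideal_value, bool):
                hits += value is not None and value >= ideal_value
            else:
                hits += value == ideal_value
        return hits / len(ideal)

    applicant = {"stayed_at_last_job_years": 4, "led_a_team_before": True,
                 "reported_cash_shortages": False}
    print(matches_ideal(applicant, IDEAL_PROFILE))  # 1.0: suitable under this crude rule

A real weighted blank refines this crude all-items-equal rule by letting each item count unequally, as discussed below.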

Department stores that did hire “bonding agencies” to investigate applicants omitted such revealing questions from their application blanks (Nagler 68). Many employers referenced physical documents confirming the applicant’s employment history, their length of unemployment, and their “current value on the labor market, as indicated by his past earnings, and his current earnings expectations” (Malm 236). Prior employment experience constituted another important section, because if a manager could not construct your life story, then you were not discerning enough to pique their interest; this likely paired with a background check through your old managers (Fletcher 20).

Back office workers at Gordon’s Department Store in Gary, Indiana in 1954, however, did not monitor new hire trial periods or follow through with bonding, instead relinquishing that responsibility to a personnel manager who had to ensure that employees did not mess around; they also did not verify prior employment experience for new hires (Nagler 10). Their female receptionist, however, did all of the work of grading applications for completion and filing unfit applications away for three months (Nagler 12; 13-14). To be fair, their application blank acted more like a meager interview flashcard because it contained only the most pertinent information (Nagler 14). They cared more about getting eyes on their advertising than hands on their application blanks anyway (Nagler 42-43). Many department stores operating during the 1950s did not provide application forms to fill out and return until after the interview rather than beforehand (Nagler 47). While the receptionist did not admit rowdy teenagers or individuals with physical disabilities, the store never typed a checkbox onto the application form to screen out minors, decades after federal child labor laws had been enacted (Nagler 65).

This was not the case at Teal’s Department Store, located in an unascertained locality, where, unlike at Gordon’s, individuals with disabilities were welcome (Nagler 23). Their applications did ask about physical weaknesses, but that was redundant since such details were revealed during the interview process (Nagler 66). Teal’s was also lax about citizenship status (Nagler 65-66). Perhaps Gordon’s thought that assessments including mathematical equations, credit rating problems, and problem-solving questions would sieve out minors (Nagler 118-120). After all, only a small percentage of firms, primarily large ones rather than smaller businesses and retailers, were using personnel assessments to measure mental ability, clerical aptitude, and more (Malm 242). Teal’s used the Henmon-Nelson Test of Mental Ability, one of the four Wonderlic assessments, and numerical and vocabulary tests for both temporary and regular applicants (Nagler 24; 29). Specific qualities were necessary to work at the store, which employed more than four thousand people and whose managers screened upwards of five thousand applicants per year (Nagler 21). Neat, poised, social, intelligent, dependable applicants were most likely to receive a callback from Teal’s (Nagler 28).

Robbin’s Department Store in Boston offered a stark application form that did not seem to have all of the necessary information on it (Nagler 67). They may have sided with the skeptical crew of managers who still hired applicants based on— in modern terminology— vibes alone, especially a “pleasing” demeanor (Malm 240). When they did consider more biodata, the prior employment experience, physical appearance, health, and mental aptitude of the applicant proved to be most important for consideration; in a jarring twist, other factors such as age, race, and sex were pitted against stereotypical caricatures (Malm 235).

The factor of “mental aptitude” ascertained from biodata is not to be conflated with the personality traits predicted by psychological assessments; hard biodata compares the applicant’s quantifiable responses to real life situations and predicts an applicant’s skills and knowledge rather than their personality (Farmer 8-9). Applicants over the age of forty-five would pitifully experience more difficulties in their job hunts and had better odds of being promoted from within (Malm 247). It was a bonus to include extraneous details pertaining to your hobbies, studies, and religious affiliations (Fletcher 20). Nagler, however, thought that lengthy application blanks took too much responsibility away from the interview process and ought to have included only the details about prior employment, “sex, marital status, number of dependents, any special skills, position desired, amount of education,” and of course the name of the applicant, excluding references, which could be skewed in the applicant’s favor (Nagler 95-96). Nagler would disagree with psychologist and author Leona E. Tyler, who in 1959 advocated for incorporating biodata into the prediction of human behavior over time; through the 1960s, data such as education, previous work experience, and marital status were increasingly considered as means to this end (Farmer 2).

It had become fairly commonplace by 1954 for the supervisors, department heads, or owners of small companies to act alone in the recruitment process and, at large firms such as factories, for personnel departments to handle different parts of the process such as annuities, benefits, training, and recruitment (Malm 232). This white-collar system mostly eclipsed the nepotistic practices that had lingered over factories and retailers for decades. One dicey factor remained, the one that had inspired Meeker to develop the job application blank as we know it today: the turnover rate. A 1955 study correlated quitting with job dissatisfaction, although firing had also been taken into account (Hom et al 531). Some importer clerks assigned to trade in various cities resorted to networking with other salesmen in order to make a good first impression and talk business (Luskey 198). Then, a 1956 study found that employees remained loyal for longer periods when their trainers or interviewers provided them with a “realistic job preview” such as a brochure or office tour (Hom et al 532). Like “vocational interest inventories,” hard biodata are adept at predicting how well applicants will perform in certain specific positions, which is clearly a vital factor for hiring managers to consider (Farmer 8).

Surprise: remote programmers were mostly women! Since they were still confined to domestic work by social conventions and wanted out, women seized the opportunity to ride this new wave of programming innovation, including Elsie Shutt, who started Computations, Inc. in Massachusetts in 1957, and Stephanie Shirley, who founded what would become F International in the United Kingdom in 1962 (Homberg et al 83). These vanguard geniuses ushered in the era of the “working professional” because male media moguls were stunned by their moxie; unfortunately, even with women stepping from the kitchen to the keyboard, men still viewed this monumental paradigm shift as an opportunity for women to improve their skills in raising children and cooking dinner (Homberg et al 84). German industrial manufacturing company Siemens, by contrast, called for the humanization of work for women willing to profit from their freedom (Homberg et al 88). They did not have to tolerate “authoritarian” or “inconsiderate” managers as did employees who grew weary of their environment over time (Hom et al 531). By 1955, corporations within the industries of finance, retail, insurance, and real estate reported that women were more likely to be hired than men; in all but the retail firms, women could even be promoted to supervisory positions (Malm 244).

Akin to the transition which occurred as the COVID-19 pandemic waned, this visionary remote system would go hybrid in the 1970s, when these programmers could travel to satellite branches of the data centers, mostly to engage in outsourced “routine desk work” (Homberg et al 78). A United States fresh out of the countercultural movement welcomed with open arms a “decentralised post-industrial society” built on a rekindled antagonism toward the ascendant corporate hierarchy of the 1970s (Homberg et al 87). Even in the 1960s, employees were becoming too independent of their superiors, implying, to some, a descent into anarchy; whether reported as societal threats or heroes, remote workers nevertheless watched their wallets grow (Homberg et al 88). Programmers were not the only futurists: the armed forces used “spatially autonomous data centres” for their tasks until the hulking hunks of metal shrank into “microcomputers” in the 1970s for some businesses (Homberg et al 77). Commoners would not encounter this kind of technology until the 1990s (Homberg et al 78). Imagine, if you will, the ascendant status of female computer programmers, supervisors, and CEOs in such an anarchic society.

Anyway, in 1961, President John Kennedy’s agency, the President’s Committee on Government Contracts, stepped in to monitor diversity compliance among contractors (MacLaury 165). Many unions, however, refused to supply accurate diversity data to Kennedy’s other committee, the President’s Committee on Equal Employment Opportunity, one of the first to specify equal employment laws that would become standard in the modern labor landscape (MacLaury 209). Kennedy was the first United States president to enforce equal opportunity employment “on the basis of merit alone” (MacLaury 226). These diversity data would herald a digitized database run by programmers. Experimentation in digital communication in the 1960s presaged personal computers hitting the public scene in the 1980s; businesses soon capitalized on the infiltration of accessible advanced digital technology to once again flip the industrial model on its head (Homberg et al 77). In 1968, one study upheld a “consistency model” which used these biodata as a bridge between job requirements and applicant backgrounds; another model proposed that biodata could be used to delineate more categories of applicants (Farmer 3). A study from 1971 found that, beyond the noise of biodata, applicants hired based on good references were hooked and willing to participate in further job orientations and trials for their superiors (Hom et al 532).


Into the Modern Age:

Soon, biodata, references, and the application process as a whole would be wrapped in a pretty bow with the onslaught of telecommunications, spearheaded by the 1973 research of Jack Nilles, which outlined how telecommuting could reduce pollution and road traffic in the name of logistical efficiency (Homberg et al 78). As technology aged, so did the humble job application blank. Employers and vendors added more items such as “distance from work, type of residence, acquaintances in the company, and membership in organizations” (Beall 18). Those pioneering programmers were treated like regular employees by the 1980s as personal computers began to pierce the public bubble (Homberg et al 77; 80).

This influx of wired correspondence overwhelmed employers, forcing them to persevere with perfecting the weighted job application blank. A 1981 study found that employees quit when they knew that they could get hired easily elsewhere, but sometimes, perhaps injudiciously, they sought new jobs only after leaving their current undesirable position; rarely did they apply elsewhere while still employed (Hom et al 534). To avoid indebting themselves with unnecessary employee wages, corporations had to develop a foolproof strategy for fair recruitment procedures by working backward from the desired end result (Dawson et al 120). For example, if tenure was a desired benchmark for applicant success, then the criterion of tenure would need to be evaluated for its relevance, predictability, quantifiability, reliability for the company, representation of the current issue at hand, individuality amidst other pressing issues, and lack of bias which would skew the applicant’s chances (Dawson et al 120), so that the company could attract the right employees with the right goals.

Another predictable factor is crime, including credit fraud, according to two studies conducted in the late 1980s by the Navy Personnel Research and Development Center on 52,000 different weighted application blank formats (Beall 21-22). Crime is clearly undesirable in candidates, so the criterion would be the absence of a criminal record; outside of performing a background check, this strategy would judge this criterion against the aforementioned variables. Next, before selecting the questions of the application blank to be weighed, employers would need to divide their current employees into desirable ones with the longest tenures and undesirable ones with the shortest tenures, since that was where the earliest evidence of the issue had been discovered; in the case of crime, the employees would instead be sorted from most substantial to most negligible criminal past (Dawson et al 121). The final draft could include discriminatory questions that properly classified applicants, such as those inquiring about race, sex, religion, et cetera, only if the employer filed a Bona Fide Occupational Qualification form to acquire approval (Dawson et al 121).

Nowadays, the inclusion of any discriminatory questions, such as those requesting marital status, as well as age, disabilities, complexion, crime, and financial background, would render a job application form void and— more importantly— illegal (Beall 19). If any United States applications or assessments screen out individuals with disabilities, whether on purpose or accidentally, intentionally ask applicants about their medical and disability history, or fail to imply or provide a reasonable accommodation in violation of the Americans with Disabilities Act (ADA), their distributors face tens of thousands of dollars in legal fines and then further charges of discrimination (EEOC 2022). While the fear of inviting a discrimination lawsuit has hindered modern firms from packing weighted job application forms with certain hard-hitting biodata questions, they can still use them as long as they are careful about what they ask for (Beall 23).

Back then, questions were further organized by class or frequency in relation to the criterion, in this case tenure, and then fed through a mathematical formula which assigned weights to each question (Dawson et al 121). In order to predict applicant behavior, biodata must intermingle with a series of questions that “in some way appear to be connected to the criterion of interest, with the ultimate goal of establishing a developmental linkage;” the indirect approach asks the applicant questions which demonstrate various behaviors that do not collectively reflect the way that the employer wants the applicant to behave, unlike the direct approach, which achieves the opposite effect (Farmer 10). A heterogeneous mix of both approaches best ensures a fair variety of outcomes and therefore weights for the assessment, which in turn are compared to the scores of an ideal applicant, assuming that the questions in question adequately represent the manager’s views and can be objectively calculated (Farmer 10). These weights were finally cross-validated against a control group of employees that would reveal a reasonable passing grade for applicants, who preferably should not have been able to detect that their application was actively grading them in the first place (Dawson et al 122). Therefore, applicants with a reasonable criminal history and a desire to work for a long time with the company join the shortlist for new hires.
Referring to numerical qualifiers rather than leaning on subjective sentiments was a vastly superior way to visualize and eliminate costly criteria; it is definitely easier to look at a line graph than to parse a manager’s journal for substantial critiques of their labor force (Dawson et al 122). Popular convention dictates that the weights of the questions should be arranged by how well they can predict the applicant’s employability for a given position: broader questions establish more connections to variegated applicant personalities and aptitudes than specific ones do (Farmer 11).
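
To make the weighting procedure concrete, here is a minimal sketch in Python of how such weights could be derived and applied; the “horizontal percent” style of weighting shown is one classic textbook method, and the employees, answers, and cutoff are invented for illustration rather than taken from Dawson et al’s actual formula.

    # A minimal sketch of deriving and applying weights for a weighted
    # application blank; all data below are invented.

    # Each historical employee: their blank answers plus a label derived
    # from tenure ("desirable" = long-tenured, "undesirable" = short-tenured).
    employees = [
        ({"owns_home": "yes", "referred": "no"},  "desirable"),
        ({"owns_home": "yes", "referred": "yes"}, "desirable"),
        ({"owns_home": "no",  "referred": "no"},  "undesirable"),
        ({"owns_home": "no",  "referred": "yes"}, "undesirable"),
    ]

    def derive_weights(employees):
        """Weight each answer by how much more often desirable employees gave it."""
        desirable = [ans for ans, label in employees if label == "desirable"]
        undesirable = [ans for ans, label in employees if label == "undesirable"]
        weights = {}
        for question in employees[0][0]:
            for answer in {ans[question] for ans, _ in employees}:
                p_des = sum(a[question] == answer for a in desirable) / len(desirable)
                p_und = sum(a[question] == answer for a in undesirable) / len(undesirable)
                # The gap between the two proportions becomes the answer's weight.
                weights[(question, answer)] = round((p_des - p_und) * 10)
        return weights

    def score(applicant_answers, weights):
        """Total the weights of an applicant's answers; unseen answers score zero."""
        return sum(weights.get(item, 0) for item in applicant_answers.items())

    weights = derive_weights(employees)
    applicant = {"owns_home": "yes", "referred": "no"}
    # Applicants scoring above a cutoff validated on a held-out control
    # group of employees would join the shortlist.
    print(score(applicant, weights))

The cross-validation step Dawson et al describe corresponds to choosing that cutoff on employees the weights were not derived from, so that the passing grade generalizes rather than memorizing the original sample.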

The job application was down to a science at this point. Since the 1980s stood at the forefront of another technological revolution, the logical next step was to plug these blanks into the mainframe of the Internet using an applicant tracking system (ATS). No longer did employers want to treat applicants like the humans that they are; they wanted to oil their machines and stimulate the flow of commerce. ATS software simply files away resumes until human— or perhaps AI— eyes glaze over them. Luckily, in 1982, ResTrac was founded as the earliest ATS software, “the first mainstream client server recruiting centric system that managed resumes, requisitions, and applicant flow,” working in conjunction with optical character recognition (OCR) software (Becker et al); this was seven years before British scientist Sir Tim Berners-Lee proposed the World Wide Web in March 1989 (CERN). OCR converts scanned images of typed or handwritten documents into text that a computer can read.

Humans still had to upload the physical materials to sort and scan them into the sister system Resumix, which organized resumes by keyword and sent them over to ResTrac for parsing (Becker et al). The postcard printed by the system and sent to the owner of the resume and application blank as a confirmation of receipt was the cherry on top of this cold machination (Becker et al). ATS software would reach none other than the Central Intelligence Agency (CIA) in 1985 as part of their ongoing paperwork reduction objective (Gates 6). Some of the excess reports had to do with “OCR document and publication procurement;” crucially, the CIA automated applicant tracking using their neoteric OCR software to tremendously free up time for more important matters (Gates 6). Since they probably receive thousands of applications a week because everybody wants to be a hotshot CIA agent, they spent their money wisely (Augustine S4). In the late 1990s, ATS software packages, including EzHire, Greentree, and Personic Software among others, could cost employers up to $200,000, including licensing fees, affordable only to Fortune 500 firms (Wheeler); imagine, if you will, what ATS software cost in the late 1980s. During their mission, the CIA team managed to automate and consolidate documents related to 56 administrative procedures, many of which they even eliminated altogether (Gates 6).

Knowledge of ATS software would leak into the public sphere as employers recruited people with software experience (Los Angeles Times 46). Starting in the 1990s, magazines instructed loyal readers on how to construct home offices, which would soon host dozens of job boards and recruitment websites (Homberg et al 90). Personal computers began to alter the family dynamic. In the 1980s, scholars counted on “neo-familialism,” a philosophy in which families would bond over their usage of home technology, to forever restitch the social fabric (Homberg et al 84). Children observed this paradigm shift firsthand at home, but a learning model from the 1970s would pipe hands-on learning and interaction into the classroom. In 1974, American educational theorist David Kolb distilled his observations of experiential education into a model which trained students to reflect upon their real life experiences and create connections (Ruiz 27).

As a substitute educator, I have visited dozens of classrooms with posters depicting daily jobs and responsibilities for students. From preschool to college, students can learn how to safely manage their social environments. The fairly recent notion of classroom chores acquaints children with this skill early and illustrates the “psychological constructs that naturally occur during active learning” (Ruiz 27). Resultant collaborative learning curricula inspired students to think beyond the classic question of why they were learning things and to dissect the reasons behind the solutions and strategies that they studied in class (Tinzmann et al 30). These included planning their own lessons and creating teams for participating in learning tasks (Tinzmann et al 13). Students endure conflict resolution troubles in the course of their classroom chores in anticipation of demonstrating to employers the soft skills typical of a stellar resume, including supervision, management, networking, time-management, et cetera (Smart 108). Deliberation of contentious subject matter, such as that which bears arguing in the classroom, fosters “responsibility for the protection of learning and safety rights in the classroom” (Riaz 13). One other source of conflict is student groups. Groupwork is a tutorial means of acclimating students to the notion of classroom chores. For instance, one student may be responsible for asking for information while another is tasked with clarifying difficult passages, or with “summarizing, encouraging, and relieving tension” within their group (Tinzmann et al 11). Students with behavioral issues who do not receive enough supervisory care may have difficulties finding a job, as employers prioritize “functional and social skills rather than academic skills” (Ruiz 38).

Schools following a Montessori curriculum, however, can expect their students to run the world independently with minimal supervision. These institutions provide the most straightforward examples of classroom chores. One Montessori school, West Coast Charter School, located somewhere in California, mandates that students clean the school, tend to plants and the farm that they apparently have, prepare the cafeteria for lunch, and engage in “routine chores and classroom maintenance” (LaRue 97). Autonomy breeds confidence, which is a common hurdle for students transitioning from grade school to college or the workforce (Ruiz 40), especially special education students, who need more support to build up the confidence for realizing their passions. While most of the time this journey imposes part-time jobs to ensure a steady initial income stream, Montessori schools employ the service-learning side of experiential education, stemming from educational reforms in the 1990s that aimed to guide disadvantaged students in the right direction (Ruiz 27). At Urban Private School, a high school in some bustling American city, the Montessori program escalates the tasks into domestic chores as well as planning entire school activities within student groups (LaRue 97).

At a standard preschool in Mississippi, students role play as landscapers, farmers, firefighters, and even cashiers in order to develop a sense of civic responsibility (Riaz 56-57). Students with specific roles help teachers with the process of cleaning the classroom (Riaz 50-51). Role playing “groomed the children into independent citizens capable of doing their jobs independently,” which most definitely prepared them for the occupational world in which they need to discover their own purpose and responsibilities in order to contribute to society (Riaz 60). Work zones set up in a classroom, as was done for students enrolled in a post-graduate alternative for secondary students (PASS) program, may demystify employment skills for special education students and hesitant students alike (Ruiz 42). Overall, through engaging in classroom chores, students procure experience in multiple facets of coveted opportunities such as management, supervision, organization, acquisition, bookkeeping, and so forth and reasonably hone those skills as they grow into adolescence and learn more about the job market (Tinzmann et al 11).

Internships escalate these skills by making prospective employees and college students do all the work of finding and applying for jobs that will benefit them (Otto 85). One can hope that students do not have to learn these skills the hard way. In the 1980s and 1990s, for example, South African school children found themselves in a perilous economic state in which it was difficult to create jobs; in response, some local businesses stepped up to let students stimulate the economy by offering them mentorships and temporary positions during school breaks (Qoto 88). Hungary may have sensed from an 8,300 mile distance the deprivation faced by these students: in 1992 it founded the Junior Achievement Hungary Foundation, which encouraged Hungarian students to run a fake business during the school year; they learned responsibility, ingenuity, and most importantly hierarchical classification, with students acting as managing directors, sales managers, account managers, accountants, and more (Qoto 31). If this program did not prepare students for the occupational world, then I have no idea what could have done so. The program ran against the grain of the Hungarian secondary school system, which extolled a standardized curriculum (Qoto 31). Standardized testing lowers student expectations for success in the real world by only supporting memory rather than nurturing critical thinking and problem-solving skills (LaRue 27; 53).

In a similar vein, the competition of internship hunting can discourage students from exercising their full potential, but the prospect of refining their experiences and networking to acquire occupational advantages in the larger job market usually keeps them afloat (Otto 85; 183). Special education students can fulfill roles in campus jobs which give them an extra push toward social independence and confidence before going for an internship opportunity (Ruiz 106); then, they can participate in post-secondary transition programs that walk them from a controlled high school resource classroom to an experience that closely mimics the workforce, which, unfortunately, many might otherwise never enter (Ruiz 123).

Modern business is international, however, and students may battle an inability to translate their locally derived skill sets into a global context (Otto 198). The world shrinks every day, so students need to be able to substantiate their value for a company that works with foreign clients; the pride gained from completing classroom chores cannot compare to the professionalism projected from successfully interacting with clients of all backgrounds. In Uganda, for example, students participating in work-based learning curricula were able to learn soft skills and compare their aspirations to the actual responsibilities of a given occupation (Otto 202). However, their teachers read from Eurocentric textbooks and therefore did not properly communicate the ubiquitous mobilization of these skills, depriving their students of their full potential to succeed (Otto 198). One would think that since most Ugandans are raised into a “white-collar job mentality” they would view technical and vocational training programs as a way to gain real world skills; on the contrary, Ugandan stakeholders and policymakers heed Aristotle’s philosophy that “the highest human activity is the cultivation of the mind” and value theoretical rote learning over technical skill (Otto 73-74). They treat this program as an extracurricular summer activity; therefore, students do not grasp the connections between learning and working (Otto 74).

They need to follow in Luis Moll’s footsteps. Anthropologist and educator Luis Moll discovered in the 1980s that Mexican-American families survived “debilitating circumstances such as poverty and discrimination” by assigning each household within a neighborhood a particular specialization, such as vehicle repair, medicine, and more, to create “funds of knowledge” resources for the neighborhood; for their children, this translated into experiential chores which harmonized them with their communities (Tinzmann et al 21). Moll used this observation to develop a laboratory in United States schools where children could participate in community building activities involving construction, research, presentations, and more (Tinzmann et al 22). This, in my opinion, subliminally equipped those students for finding a purpose in the world, which unconditionally starts with finding a job; the same goes for Ugandan students if they are not earning a degree from a university, trade school, or technical school. Subsequently, students would find themselves facing the humble job application form or educational application form at some point in the future with a colorful backpack of skills.

Soon, people would be able to reduce their paper waste, reuse their information, and recycle it to an online platform. I find it intriguing that by attaching cover letters to the resumes we submit with job applications, we mimic a practice which originated during ancient times, when oral anecdotes passed on traditions and customs to future generations and “admonished mindful consideration of audience, purpose, and context in determining the content, arrangement, and style of a speech” (Edwards 5). As aforementioned, Sir Tim Berners-Lee proposed the World Wide Web in March 1989 (CERN). His goal was to elucidate the Internet and “merge the evolving technologies of computers, data networks, and hypertext into a powerful and easy to use global information system” (CERN), a side project from his work at the particle physics laboratory. In 1994, research suggested that employees could screen alternative job opportunities based upon their personal values and compare the results to their current job in order to decide whether to quit (Hom et al 536). Well, this was their moment.

Propelling the democratization of the application process (Giang), human resources manager Jeff Taylor uploaded The Monster Board to the Internet in order to gather and post help wanted ads from national newspapers. Taylor first advertised this leading job board of the Internet in the Boston Sunday Globe in 1995 as a resource for the hottest job opportunities on the Internet— albeit the only ones available through a computer screen (Boston Sunday Globe). Later that year, Taylor promoted The Monster Board as an empowering purlieu for college students featuring “a virtual world of cybersuites and webtasia for career opportunity and mosh pit surfing” (Bank B16). By September, it boasted “nearly three thousand job listings spread over a base of five hundred employers” and a ROAR section, where “writers, musicians, and artists [could] find info on developing their careers” for the chance to win a “free T-shirt” (Pawlak F3). Christy Spilka, vice president of Internet Collaborative Information Management Systems (ICIMS), has stated that “a strong talent attraction strategy and a great employer brand, combined with an engaging and authentic careers site and an easy application process, is critical” (Maurer). Taylor achieved that with his charm and urbanity. At this time in the sphere’s development, most of the competition “offer[ed] more in the way of career advice and resume tips than they [did] in actual job openings” while others “[padded] the number of job listings they claim[ed] by including the thousands of postings on the Internet’s Usenet newsgroups, the discussion forums that often include outdated or incorrect listings” (Bank B16). The Monster Board was a free service that let people know for the first time the identities of the top hiring companies, whose executive recruiters were perusing job applications on the Resume On-Ramp (Ottawa Citizen).

Moving the recruitment process online allowed employers to save tens of thousands of dollars a year on marketing and recruitment services; one service called Bernard Hodes Advertising, as an example, charged “$10,000 to $20,000 to create and manage a multiple-page Web site for an entire year,” while in comparison, it cost up to $18,000 to post a classified “half-page ad in the Sunday edition of the San Jose Mercury News” (Bank B16). Also in 1995, software developer Robert McGovern founded NetStart Incorporated, a human resources software company that earned him an investment of $2 million USD, just enough to create CareerBuilder as a standalone website, and programmer Craig Newmark posted his list of classified advertisements to Craigslist, where employers could encourage users to apply to the jobs that they posted (Hur). Because of this explosion in convenience, the number of job applicants doubled and tripled in the early 2000s (Sundberg [a]). The Monster Board gained traction in 1996 when it went public and Taylor issued a press release about it (Hur). In the same year, Richard Johnson, a Silicon Valley entrepreneur, sent HotJobs to the Internet as a venue where employers could post niche openings in accounting, sales, and eventually all domains (Hur). This correlates with a 2002 study that found that employees looking to quit their current positions would painstakingly research labor market trends to find job prospects and see where they would be most relevant, otherwise accepting unsolicited counteroffers from humans or, ostensibly, from job boards such as Monster or CareerBuilder (Hom et al 537).

There cannot be commercial development without the possibility of vertical integration and merging. In 1998, Webhire acquired the legendary ATS software ResTrac before getting bought out by Kenexa in 2005 and, in turn, IBM in 2012 (Becker et al). In 1999, Online Career Center merged with The Monster Board, and the site rebranded itself as the Monster that we know and love today (Hur). By 2005, Rony Kahan and Paul Forster had brought Indeed to the Internet as a scraper site which aggregated job postings from the other nascent job boards (Hur). As the race for the title of supreme job board commenced, programmers would edge forward with ATS software. The ability for recruiters and hiring managers to share the same candidate data organized in one convenient database reduced discrimination bias and softened their workload, so I guarantee that the large investment was worth it (Kulkarni 7). Yes, recruiters still have to probe your application and attached resume; even well into the 2020s, most job applications are still screened by human beings using human eyeballs after artificial intelligence scans for keywords via ATS software (Sundberg [a]). Sometimes they enumerate how many of each keyword a resume or application form contains in order to justify the skill level of the applicant (Augustine S4).
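
As a rough illustration of that keyword scan, here is a minimal Python sketch; the posting keywords, resume text, and pass threshold are invented placeholders rather than the behavior of any particular ATS.

    # A minimal sketch of an ATS-style keyword scan; all data are invented.

    import re
    from collections import Counter

    POSTING_KEYWORDS = {"payroll", "recruiting", "python", "sql"}  # hypothetical

    def keyword_counts(resume_text):
        """Count how often each posting keyword appears in the resume."""
        tokens = re.findall(r"[a-z]+", resume_text.lower())
        return Counter(token for token in tokens if token in POSTING_KEYWORDS)

    resume = ("Managed payroll and recruiting pipelines; "
              "automated weekly reports in Python and SQL.")
    counts = keyword_counts(resume)
    print(counts)  # how many of each keyword the form contains
    # A crude screen a recruiter might configure before human eyes take over:
    print("advance" if sum(counts.values()) >= 3 else "reject")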

Enterprise recruitment systems began eliminating the need for recruiting departments to generate reports as requested by managers, since these systems accounted for how many employees companies could afford, managed the job requisition process, generated web pages to be posted onto the early job boards, linked “to benefits, payroll, and career development systems to provide automated employment applications and conduct background screening of potential candidates,” and tracked applicants via the Internet or Intranet (Wheeler). Human employers still create and advertise their open job postings, collect resumes by hand or through job board sites to which they subscribe, screen and interview applicants, and make offers as needed; the ATS integrates all of these phases with productivity tools such as email or messaging services (Kulkarni 7). This reliance on ATS software can damage employers’ conversion rates and recruiting metrics, as the variety and cumbersome interfaces of these systems may scare away applicants and therefore tarnish the bottom line (Maurer). Programmers gained more rights, and larger paychecks, in the early 2000s when managers began devising ways to soak their businesses into the programmers’ computers (Homberg et al 80).

This percolation, while promoting the best of intentions, would introduce overzealous monitoring and exhausting standards for profitability (Homberg et al 85). Artificially intelligent apps such as ZenCV, a recruitment app developed by Andrés Farfán Granda of Milan, Italy, can be risky, for example; Farfán installed a virtual chatbot to help applicants write a cover letter, analyze it, and, like an ATS, decide whether the applicant can progress further through the process, basing the assessment on “multiple dimensions, specially those parameters that will be important for the company and the position” such as “how fit the person is for the position, emotional coefficient, intellectual coefficient, psychological tests, soft skills, logic, et cetera” (Farfán 45).

Farfán must tread lightly with artificial intelligence. The United States Equal Employment Opportunity Commission (EEOC) recognizes chatbots and personality assessments as software capable of incorporating algorithmic decision-making (EEOC 2022). Even though oversights are uncommon due to complex qualification algorithms, employers still need to be cautious (Giang). Failure to issue “an alternative testing format or a more accurate assessment of the applicant’s or employee’s skills as a reasonable accommodation,” or to prevent applicants with disabilities from receiving lower scores on assessments due to, for instance, long gaps in employment history, can result in fines (EEOC 2022). Questions or analyses may not favor verbiage that may suppress the preferences or choices of individuals with disabilities; a business is held liable even when it does not create the assessments or applications by itself, opting instead to purchase prefabricated forms from books or CDs with selections of forms (Inc.). Modern students with disabilities graduating from career transition programs may be fond of programs like Google Bard and ChatGPT and may not think twice about their probability of encountering discriminatory employment practices through an artificially intelligent filter, so it is of the utmost importance for corporations to publicize user-friendly interfaces, alternative testing or application formats, and simplified instructions (EEOC 2022). Artificial intelligence does not understand that a human who cannot stand for long periods can still, with reasonable accommodations, work at a retailer or factory (EEOC 2022).

LOOK OUT FOR PART TWO!


Works Cited


“A Short History of the Web.” CERN (Conseil Européen Pour La Recherche Nucléaire), 2023, home.cern/science/computing/birth-web/short-history-web.


Baker, Suzanne Helen. “The Coming of Conscription in Britain.” North Texas State University, University of North Texas Digital Library, 1972, pp. iii–245.


Beall, Ge Ge Ellenburg. “Validity of the Weighted Application Blank Across Four Job Criteria: A Meta-Analysis.” Applied Human Resource Management Research, vol. 2, no. 1, 1991, pp. 18–26.


Becker, Tom, et al. “The History of Recruiting: 1900 - Present.” Edited by Adela Schoolderman and Gerry Crispin, ERE.net, ERE Media, 15 Nov. 2021, https://www.ere.net/wp-content/uploads/sites/2/2021/11/History-of-Recruiting-1900-present-day-11-15-21.pdf.


Benson, Susan Porter. “The Cinderella of Occupations: Managing the Work of Department Store Saleswomen, 1900-1940.” The Business History Review, vol. 55, no. 1, 1981, pp. 1–25., https://doi.org/10.2307/3114439.


Bicknell, Alexandra. “Atlantic Abolition in the Borderlands: the Interesting Narrative of Mahommah Gardo Baquaqua.” Western Michigan University, Scholarworks at WMU, 2020, pp. 1–35. https://scholarworks.wmich.edu/honors_theses/3331.


Blake, Holly Jacklyn. “Marie Howland— 19th-Century Leader for Women's Economic Independence.” The American Journal of Economics and Sociology, vol. 74, no. 5, Nov. 2015, pp. 878–1192., https://www.jstor.org/stable/43818675.


Blanke, David. “Sowing the American Dream: Consumer Culture in the Rural Middle West, 1865-1900 (Volume I - Chapters 1 to 6).” Loyola University Chicago, Loyola eCommons, 1996, pp. ii–429. https://core.ac.uk/works/44870752.


Bloomfield, Meyer. “The New Profession of Handling Men.” The Annals of the American Academy of Political and Social Science, vol. 61, Sept. 1915, pp. 121–126., https://www.jstor.org/stable/1013006.


Bragdon, Joseph. “Trend Toward a Five-Hour Day.” Current History (1916-1940), vol. 33, no. 6, Mar. 1931, pp. 854–858., https://www.jstor.org/stable/45333648.


Chandler Jr, Alfred D. “Industrial Revolutions and Institutional Arrangements.” Bulletin of the American Academy of Arts and Sciences, vol. 33, no. 8, May 1980, pp. 33–50., https://doi.org/10.2307/3823248. (a)


Chandler Jr, Alfred D. “The Emergence of Managerial Capitalism.” The Business History Review, vol. 58, no. 4, 1984, pp. 473–503., https://doi.org/10.2307/3114162. (b)


Clapp, Frank Leslie, and Robert V Young. “A Self-Marking English Form Test.” The Elementary English Review, vol. 5, no. 10, Dec. 1928, pp. 304–306., https://www.jstor.org/stable/41381294.


Cody, Sherwin. How to Deal with Human Nature in Business: A Practical Book on Doing Business by Correspondence, Advertising, and Salesmanship. 2nd ed., University of Michigan School of English, 1915.


Cuenin, Paul M. “A Statistical and Theoretical Treatment of Hours of Work in the United States.” Boston University, Boston University Institutional Repository, 1949, pp. iii–111. https://open.bu.edu/handle/2144/16644.


Dawson, Donald B, et al. “Developing and Using Weighted Application Blanks: An Experiential Exercise.” Developments in Business Simulation and Experiential Learning, vol. 11, 1984, pp. 120–123.


“Delivery: Monday through Saturday since 1863.” USPS, United States Postal Service, June 2009, https://about.usps.com/who/profile/history/pdf/delivery-monday-through-saturday.pdf.

Domosh, Mona. Invented Cities: The Creation of Landscape in Nineteenth-Century New York & Boston. Yale University Press, 1996.


Edwards, Verlane Dee. “Rhetoric of Professional Correspondence: Origins of Contemporary Practice.” Iowa State University, Iowa State University Digital Repository, n.d., pp. ii-128.

Farfán Granda, Andrés Esteban. “Internalization Process for Artificial Intelligence Recruitment Startups.” Polytechnic University of Milan, 2021, pp. 1–86. http://hdl.handle.net/10589/175096.


Foner, Philip S, and Ronald L Lewis, editors. “PART I ECONOMIC CONDITION OF THE BLACK WORKER.” The Black Worker, Volume 6: The Era of Post-War Prosperity and the Great Depression, 1920-1936, vol. 6, no. 1, 1981, pp. 1–136., https://doi.org/10.2307/j.ctvn5tvxv.5.


Frankel, Noralee, and Nancy S Dye, editors. Gender, Class, Race, and Reform in the Progressive Era. The University Press of Kentucky, 1991. https://core.ac.uk/works/71219968.


Gates, Robert M. “The Directorate of Digital Innovation.” 11 Dec. 1985. https://www.cia.gov/readingroom/docs/CIA-RDP95M00249R000801140004-4.pdf.


Gardner, Edward Hall. Effective Business Letters: Their Requirements and Preparation, with Specific Directions for the Various Types of Letters Commonly Used in Business. Ronald Press, 1916.


Giang, Vivian. “Why New Hiring Algorithms Are More Efficient — Even If They Filter Out Qualified Candidates.” Business Insider, 25 Oct. 2013, www.businessinsider.com/why-its-ok-that-employers-filter-out-qualified-candidates-2013-10.


Gilbert, Gustave Mark. Nuremberg Diary. Farrar, Straus and Giroux, 1947.


Gleason, Leigh. “Canvassed and Delivered: Direct Selling at Keystone View Company, 1898-1910.” De Montfort University, De Montfort University Open Research Archive, 2018, pp. 2–277. https://core.ac.uk/works/78608352.


Glancey, Jonathan. “A History of the Department Store.” Edited by Christian Blauvelt, BBC Culture, British Broadcasting Corporation, 26 Mar. 2015, https://www.bbc.com/culture/bespoke/story/20150326-a-history-of-the-department-store/index.html.


Hitchcock, Jane Elizabeth. “Annual Report of the New York State Board of Nurse Examiners.” The American Journal of Nursing, vol. 5, no. 3, Dec. 1904, pp. 171–180, https://doi.org/10.2307/3402094.


Hom, Peter W, et al. “One Hundred Years of Employee Turnover Theory and Research.” Journal of Applied Psychology, vol. 102, no. 3, 2017, pp. 530–545., https://doi.org/10.1037/apl0000103.


Homberg, Michael, et al. “From ‘Home Work’ to ‘Home Office Work’?: Perpetuating Discourses and Use Patterns of Tele(Home) Work since the 1970s: Historical and Comparative Social Perspectives.” Work Organisation, Labour & Globalisation, vol. 17, no. 1, 2023, pp. 74–116, https://www.jstor.org/stable/48724543.


Huey, Katharine. “Problems Arising and Methods Used in Interviewing and Selecting Employees.” The Annals of the American Academy of Political and Social Science, vol. 65, no. Personnel and Employment Problems in Industrial Management, May 1916, pp. 208–218., https://www.jstor.org/stable/1013574.


Hur, Johnson. “History of the Online Job Search.” BeBusinessed, BeBusinessed, https://bebusinessed.com/history/history-online-job-search/.


Ingraham, D.S. “Letter of Rev. D.S. Ingraham [American Missionary].” Received by The Boston Liberator, Boston, Massachusetts, 17 Jan. 1839, Kingston, Jamaica.


Innis, Harold A. “The Newspaper in Economic Development.” The Journal of Economic History, vol. 2, no. Supplement: The Tasks of Economic History, Dec. 1942, pp. 1–33, https://www.jstor.org/stable/2112934.


John Jr., Richard R. “Private Mail Delivery in the United States during the Nineteenth Century: A Sketch.” Business and Economic History, vol. 15, 1986, pp. 135–147., https://www.jstor.org/stable/23702866.


Keogh, Tim. “Suburbs in Black and White: Race, Jobs & Poverty in Twentieth-Century Long Island.” City University of New York, CUNY Graduate Center, 2016, pp. i–396. https://core.ac.uk/works/63335317.


Key, Francis Scott. “Reply to Rev. Dr. Tappan.” Received by Dr. Benjamin Tappan, Windsor, Vermont, 8 Oct. 1838, Washington, D.C., Washington, D.C.


Kilgore, Deborah Katheryn. “Interweaving History: The Texas Textile Mill and McKinney, Texas, 1903-1968.” University of North Texas, University of North Texas Digital Library, 2009, pp. ii–264. https://core.ac.uk/works/135000056.


Klassen, Henry C. “T. C. Power & Bro.: The Rise of a Small Western Department Store, 1870-1902.” The Business History Review, vol. 66, no. 4, winter 1992, pp. 671–722, https://doi.org/10.2307/3116844.


Klein, Christopher. “The Spies Who Launched America’s Industrial Revolution.” History, A&E Television Networks, LLC, 10 Jan. 2019, https://www.history.com/news/industrial-revolution-spies-europe.


Kopytek, Bruce Allen. “Arnold, Constable & Co. New York City, New York.” The Department Store Museum, Blogger, Aug. 2011, http://www.thedepartmentstoremuseum.org/2011/08/arnold-constable-co-new-york-city-new.html.


Kravitz, Bennett. “A Certain Doubt: The Lost Voice of Deborah Samson in Revolutionary America.” Studies in Popular Culture, vol. 22, no. 2, Oct. 1999, pp. 47–60, https://www.jstor.org/stable/41970372.


Kulkarni, Swatee B, and Xiangdong Che. “Intelligent Software Tools for Recruiting.” Journal of International Technology and Information Management, vol. 28, no. 2, 1 July 2019, pp. 2–16, https://doi.org/10.58729/1941-6679.1398.


Laird, Pamela Walker. Advertising Progress: American Business and the Rise of Consumer Marketing. Johns Hopkins University Press, 2019. https://doi.org/10.1353/book.72714.


LaRue, Wendy J. “Empowering Adolescents: A Multiple Case Study of U.S. Montessori High Schools.” Walden University, Walden University ScholarWorks, 2010, pp. iii–194. https://scholarworks.waldenu.edu/dissertations/731/.


Lauwaert, Maaike. “Part I: New Children, Different Toys.” The Place of Play: Toys and Digital Cultures, 2009, pp. 21–44., https://www.jstor.org/stable/j.ctt46mx23.4.


Lebergott, Stanley. “Labor Force and Employment, 1800–1960.” National Bureau of Economic Research, Edited by Dorothy S Brady, 1966, pp. 117–204. Output, Employment, and Productivity in the United States after 1800, https://www.nber.org/system/files/chapters/c1567/c1567.pdf.


Luskey, Brian P. “Jumping Counters in White Collars: Manliness, Respectability, and Work in the Antebellum City.” Journal of the Early Republic, vol. 26, no. 2, summer 2006, pp. 173–219, https://www.jstor.org/stable/30043407.


Lynch, Deidre Shauna. “Counter Publics: Shopping and Women's Sociability.” Romantic Sociability: Social Networks and Literary Culture in Britain, 1770-1840, edited by Gillian Russell and Clara Tuite, Cambridge University Press, Cambridge, United Kingdom, 2006, pp. 1–280.


Mabie, Hamilton Wright, and Marshall Huntington Bright. Mabie’s Popular History of the United States. J.C. Winston, 1897.


Mackillop, Andrew. “Military Recruiting in the Scottish Highlands 1739-1815: The Political, Social and Economic Context.” University of Glasgow, OpenGrey Repository, 1995, pp. i–396.


MacLaury, Judson. “To Advance Their Opportunities: Federal Policies Toward African American Workers from World War I to the Civil Rights Act of 1964.” University of Tennessee, Newfound Press, 2008, pp. vii–298. https://core.ac.uk/works/75166552.


Malm, F. Theodore. “Hiring Procedures and Selection Standards in the San Francisco Bay Area.” Industrial & Labor Relations Review, vol. 8, no. 2, Jan. 1955, pp. 231–252., https://doi.org/10.2307/2519388.


Martin, Ann Smart. “Buying into the World of Goods: Eighteenth-Century Consumerism and the Retail Trade from London to the Virginia Frontier.” College of William & Mary, W&M Scholarworks, 1993, pp. iii–402. https://dx.doi.org/doi:10.21220/s2-q2mr-b119. (a)


Martin, Ann Smart. “Makers, Buyers, and Users: Consumerism as a Material Culture Framework.” Winterthur Portfolio, vol. 28, no. 2/3, autumn 1993, pp. 141–157, https://www.jstor.org/stable/1181525. (b)


Maurer, Roy. “Most People—92%—Never Finish Online Job Applications.” SHRM, SHRM, 16 Feb. 2022, https://www.shrm.org/resourcesandtools/hr-topics/talent-acquisition/pages/most-people-never-finish-online-job-applications.aspx.


McClaskie, Maud. “Proceedings of the Tenth Annual Convention of the Nurses' Associated Alumnae of the United States.” 1907, pp. 791–911.


Meeker, Royal. “Proceedings of the Employment Managers' Conference, Philadelphia, PA, April 2 and 3, 1917.” United States Department of Labor, 1917, pp. 1–210. https://fraser.stlouisfed.org/files/docs/publications/bls/bls_0227_1917.pdf?utm_source=direct_download.


Moore, Frank R. “English in the Grammar Grades.” Journal of Education, 27 Mar. 1902, 203, 206. https://www.jstor.org/stable/44063145.


Nagler, Leon Gregory. “Pre-Employment Procedures in Department Stores: A Survey and Analysis.” Boston University, Boston University OpenBU, 1954, pp. 1–133. https://hdl.handle.net/2144/8335.


New England Historical Society. “At Half His Age, Sybil Ludington Rode Twice as Far as Paul Revere: But She Did It For the Same Reason.” New England Historical Society, updated 2023, newenglandhistoricalsociety.com/half-age-sybil-ludington-rode-twice-far-paul-revere-same-reason/.


Opler, Daniel J. “For All White-Collar Workers: The Possibilities of Radicalism in New York City’s Department Store Unions, 1934–1953.” The Ohio State University, The Ohio State University Press, 2007, pp. v–270. https://core.ac.uk/works/51797116.


Osborne, Nicholas. “Little Capitalists: The Social Economy of Saving in the United States, 1816–1914.” Columbia University in the City of New York, Columbia University Libraries Academic Commons, 2014, pp. i–406. https://doi.org/10.7916/D8VM49F2.


Otto, Susan. “Work-Based Learning in the Ugandan Secondary School Curriculum.” University of Toronto, TSpace of the University of Toronto, 2022, pp. ii–276. https://hdl.handle.net/1807/126386.


Pedulla, David S. “How Race and Unemployment Shape Labor Market Opportunities: Additive, Amplified, or Muted Effects?” Social Forces, vol. 96, no. 4, June 2018, pp. 1477–1506, https://www.jstor.org/stable/26563304.


Popken, Randall. “The Pedagogical Dissemination of a Genre: The Resume in American Business Discourse Textbooks, 1914–1939.” Journal of Advanced Composition, vol. 19, no. 1, 1999, pp. 91–116, https://www.jstor.org/stable/20866224.


Priest, George L. “The History of the Postal Monopoly in the United States.” The Journal of Law & Economics, vol. 18, no. 1, Apr. 1975, pp. 33–80, https://www.jstor.org/stable/725246.


Prowell, Elizabeth Mae. “Personnel Policies of Boston Hotels.” University of New Hampshire, Boston University OpenBU, 1947, pp. ii–155. https://core.ac.uk/works/46227166.


Qoto, Nomonde Monica. “Assessing Entrepreneurship Education Programmes in Secondary Schools.” Nelson Mandela Metropolitan University, Department of Academic Administration, 2012, pp. ii–137. https://core.ac.uk/works/9567637.


Quimby, Ian M.G. “The Doolittle Engravings of the Battle of Lexington and Concord.” Winterthur Portfolio, vol. 4, 1968, pp. 83–108, https://www.jstor.org/stable/1180488.


Reeds, Anne Bernie. “A Study of Certain Hiring Practices of Five Selected Agencies in Decatur, Illinois.” Eastern Illinois University, The Keep, 1966, pp. iii–34.


Resseguie, Harry E. “Alexander Turney Stewart and the Development of the Department Store, 1823–1876.” The Business History Review, vol. 39, no. 3, 1965, pp. 301–322, https://doi.org/10.2307/3112143.


Rezneck, Samuel. “The Social History of an American Depression, 1837–1843.” The American Historical Review, vol. 40, no. 4, July 1935, pp. 662–687, https://doi.org/10.2307/1842418.


Riaz, Muhammad. “Analysis of Classroom Practices That Preschool Teachers Use to Promote Civic Efficacy.” Mississippi State University, Mississippi State University Institutional Repository, 2017, pp. iii–95. https://core.ac.uk/works/127154140.


Rogers, Brishen. “Toward Third-Party Liability for Wage Theft.” Berkeley Journal of Employment and Labor Law, vol. 31, no. 1, Mar. 2010, pp. 1–64, https://www.jstor.org/stable/26377726.


Romanski, Fred J. “The Fast Mail: A History of the U.S. Railway Mail Service.” National Archives, The U.S. National Archives and Records Administration, 8 Dec. 2022, https://www.archives.gov/publications/prologue/2005/fall/fast-mail-1.html.


Rounds, Charles Ralph. “Fifth Annual Meeting.” Proceedings of the Fifth Annual Meeting, Chicago, Illinois, November 25–27, 1915, National Council of Teachers of English, 1915, pp. 33–78. https://www.jstor.org/stable/801821.


Ruggeri, Giuseppe. Work and Leisure in America. FriesenPress, 2021.


Ruiz, Robin E. “Understanding a Transitional Educational Program for Students with Autism and Intellectual Disabilities.” Walden University, Walden University ScholarWorks, 2020, pp. i–165. https://core.ac.uk/works/96630636.


Sanders, Steven C. “The Evolution of Eighteenth-Century Upholders in London.” Oxford Brookes University, RADAR Institutional Repository of Oxford Brookes University, 2021, pp. i–246. https://core.ac.uk/works/18795766.


Sells, Saul B. “Measurement and Prediction of Special Abilities.” Review of Educational Research, vol. 14, no. 1: Psychological Tests and Their Uses, Feb. 1944, pp. 38–54, https://doi.org/10.2307/1168162.


Shaw, Steven J. “Colonial Newspaper Advertising: A Step toward Freedom of the Press.” The Business History Review, vol. 33, no. 3, 1959, pp. 409–420, https://www.jstor.org/stable/3111955.


Simpson, Liana. “Why Employers Need Both a Resume and a Job Application.” Sequoia Personnel Services, 1 Feb. 2015, https://sequoiapersonnel.com/2015/02/why-employers-need-both-a-resumes-and-a-job-application/.


Smart, Lyndsey. “Teacher Experiences in Creating an Invitational Learning Environment in a Diverse Classroom.” University of Pretoria, UPSpace at the University of Pretoria, 2019, pp. i–148. https://core.ac.uk/works/123647523.


Stuit, Dewey B. “Construction and Educational Significance of Aptitude Tests.” Review of Educational Research, vol. 20, no. 1: Educational and Psychological Testing, Feb. 1950, pp. 27–37, https://doi.org/10.2307/1168652.


Sundberg, Jörgen. “Before LinkedIn, How Did Our Ancestors Find Jobs?” Undercover Recruiter, 2023, https://theundercoverrecruiter.com/infographic-linkedin-how-exactly-did-our-ancestors-find-jobs/. (a)


Sundberg, Jörgen. “The History of Job Applications: From Faxing to Social Media.” Undercover Recruiter, 2023, https://theundercoverrecruiter.com/infographic-the-history-applying-jobs/. (b)


Tabb, Charles Jordan. “The History of the Bankruptcy Laws in the United States.” American Bankruptcy Institute Law Review, vol. 3, no. 5, 1995, pp. 5–51, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2316255.


Teachers' National Bureau. “Teachers' National Bureau, Organized 1871.” The Educationist, vol. 2, no. 4, July 1874, p. 63. https://ia801905.us.archive.org/20/items/sim_educationist_1874-07_2_4/sim_educationist_1874-07_2_4.pdf.


Thale, Mary. “The Case of the British Inquisition: Money and Women in Mid-Eighteenth-Century London Debating Societies.” Albion: A Quarterly Journal Concerned with British Studies, vol. 31, no. 1, spring 1999, pp. 31–48, https://doi.org/10.2307/4052815.


“The Creation, 1832–1864.” Smithsonian National Postal Museum, Smithsonian Institution, 2023, https://postalmuseum.si.edu/research-articles/the-railway-mail-service-history-of-the-service/the-creation-1832-1864.


“The Employment Situation.” Bureau of Labor Statistics, United States Department of Labor, 7 May 2023, https://www.bls.gov/news.release/pdf/empsit.pdf.


Thiers, Marie-Joseph-Louis-Adolphe. History of the French Revolution. Translated by Thomas W. Redhead, Archibald Fullarton and Co., 1845.


Tinzmann, Margaret Baker, et al. The Collaborative Classroom: Reconnecting Teachers and Learners, 3rd ed., North Central Regional Education Lab and Public Broadcasting Service, 1990, pp. 2–78. Restructuring to Promote Learning in America’s Schools. https://files.eric.ed.gov/fulltext/ED327931.pdf.


United States, Department of the Navy, Navy Personnel Research, Studies, and Technology Division, and William L. Farmer. A Brief Review of Biodata History, Research, and Applications, Bureau of Naval Personnel, 2006, pp. v–51.


United States Postal Service. We Deliver: The Story of the U.S. Postal Service. United States Postal Service, Washington, D.C., 1980. https://files.eric.ed.gov/fulltext/ED281820.pdf.


Varga, Lukáš. Business Correspondence. Slezská Univerzita, 2019.


Vida, István Kornél. “The ‘Great Moon Hoax’ of 1835.” Hungarian Journal of English and American Studies, vol. 18, no. 1/2, 2012, pp. 431–441, https://www.jstor.org/stable/43488485.


Wheeler, Kevin. “Enterprise Recruiting: What Is It All about?” ERE.net, ERE Media, 17 Feb. 1999, https://www.ere.net/enterprise-recruitingwhat-is-it-all-about/.


White, John H., editor. “Histories of the Individual Firms.” Railroad History, no. 197, winter 2007, pp. 24–85, https://www.jstor.org/stable/43524479.


Newspapers


Augustine, Amanda. “What Is an ATS? How to Write a Resume to Beat the Bots.” The Morning Call, 30 June 2019, p. S4.


Bank, David. “Computer May Link You to New Job.” Asbury Park Sunday Press, 18 June 1995, p. B16.


Brewster, Ralph G. “Dates Changed: Accountancy Examinations Here Nov. 13-14.” Portland Evening Express and Daily Advertiser, 16 Oct. 1919, p. 20.


Craig, Neville Burgoyne, editor. “Facts for Laboring Men.” The Pittsburg Gazette, 3 Apr. 1840, p. 2.


Cullen, Benjamin J. “Wanted.” The Daily Argus, 24 Jan. 1923, p. 12.


Fletcher, William L. “How to Get the Job You Want: LXVIII - Filling Out Employer’s Blank.” Harrisburg Telegraph, 12 Oct. 1923, p. 20.


Hospital Council of Southern California. “Director, Health Careers Information Center.” The Los Angeles Times, 6 Jan. 1991, p. 46.


Jewett, Elam R., and Thomas M. Foote, editors. “The House of Industry, for the Relief of Poor Females.” Commercial Advertiser and Journal, 11 May 1841, p. 3.


Kennedy, Joyce Lain. “Here Are Some Fresh Tips to Keep Your Resume Alive.” St. Lucie News Tribune, 7 Apr. 2013, p. F3.


Kunzelman, Michael, et al. “Third Accuser Has History of Legal Disputes.” Bismarck Tribune, 2 Oct. 2018, p. A4.


“Last Chapter in Fall of Old Department Store Zone Recorded Last Week by Arnold, Constable.” The Sun, 4 Oct. 1914, p. 57.


Morphew, John. “Advertisements.” The Post Boy, 18 Jan. 1717, p. 2.


Pawlak, Jim. “Surf’s up on the WWW.” Wisconsin State Journal, 10 Sept. 1995, p. 2I.


Sevier, Henry Hulme, et al., editors. “Many Want To Enroll For Shipbuilding; The Mayor Has No Blanks.” The Statesman, 5 Feb. 1918, p. 8.


Susong, Bruce I. “Wanted.” The Cincinnati Post, 16 Mar. 1927, p. 18.


The Monster Board. “We Put A Monster On The Internet: Imagine What We Could Do For Your Career!” Boston Sunday Globe, 8 Jan. 1995, p. A46.


The Monster Board. “We Put A Monster On The Internet: Imagine What We Could Do For Your Career!” The Ottawa Citizen, 7 June 1995, p. A9.


“To Wholesale and Retail Dry Goods Merchants.” The Evening Post, 11 Mar. 1825, p. 3.


Tyler, Leslie B. “Student Nurse.” The Scranton Republican, 29 Dec. 1919, p. 13.


“United States Employment Service Opens Offices and Is Ready for Its Work Here.” The Tuscaloosa News, 7 Aug. 1918, p. 4.
