Maya Ray Thurman Hawke (born July 8, 1998)[2] is an American actress and singer-songwriter. She is the daughter of Hollywood actors Ethan Hawke and Uma Thurman. She began her career in modeling, and subsequently made her screen debut as Jo March in the 2017 BBC adaptation of Little Women.
Hawke was born on July 8, 1998,[2] in New York City, the older of two children of actors Ethan Hawke and Uma Thurman.[3] Her parents met on the set of Gattaca (1997), married in May 1998,[4] and divorced in 2005.[5] Hawke has a brother, Levon.[6][7] She also has two half-sisters by her father's second wife, Ryan Shawhughes,[8][9] and another half-sister by her mother's ex-fiancé, the financier Arpad Busson.[10]
Hawke has dyslexia,[15] which resulted in her changing schools frequently during her primary education before she was finally enrolled at Saint Ann's School, a private school in Brooklyn, New York, that emphasizes artistic creativity and does not grade work. The artistic environment eventually led her to acting.[7] Hawke also took part in summer studies at the Royal Academy of Dramatic Art in London and the Stella Adler Studio of Acting in New York.[16] She studied toward a BFA in acting at the Juilliard School for one year before dropping out after accepting her role in Little Women.[7]
Like both her mother and grandmother, Hawke modelled for Vogue at the beginning of her career.[3][17] She was also chosen as the face of the British fashion retailer AllSaints' 2016/2017 collection.[7] In 2017, she appeared as one of several faces in a video campaign for Calvin Klein's underwear range, directed by Sofia Coppola.[18] In September 2022, Hawke modelled for Calvin Klein's FW22 Underwear campaign.[19]
Hawke was Sofia Coppola's choice to play the title role in Universal Pictures' live-action adaptation of The Little Mermaid. However, the producers preferred actress Chloë Grace Moretz, and this and other conflicts ultimately led to Coppola leaving the project.[20] Moretz eventually dropped out as well.[21]
In 2020, Hawke starred in Gia Coppola's second film, Mainstream, alongside Andrew Garfield.[27][28] The same year, she guest-starred in the fifth episode of the miniseries The Good Lord Bird, playing Annie Brown, the daughter of the character played by her father, Ethan Hawke.[29] In June 2021, she appeared in Italian Studies, written and directed by Adam Leon and co-starring Vanessa Kirby; it premiered at the Tribeca Film Festival and was released on January 14, 2022.[30] The following month, she appeared as Heather in the Netflix horror film Fear Street Part One: 1994.[31] In 2021, she also starred in Rebel Robin: Surviving Hawkins, a spin-off podcast series based on her Stranger Things character,[32] and in another scripted podcast series, The Playboy Interview, in which she played Helen Gurley Brown.[33]
Hawke has cited folk music as an influence on her career, naming artists such as Leonard Cohen, Patti Smith, and Joni Mitchell.[41][38] She takes inspiration from "great lyricists" like Cohen and Bob Dylan because, in her words, "music is the best way to communicate poetry".[42]
In August 2019, Hawke released her first two singles, "To Love a Boy" and "Stay Open".[43] The songs were written and recorded by Hawke and the Grammy Award-winning singer-songwriter Jesse Harris.[44] Hawke performed a series of headlining shows around New York City in early 2020, her first solo live performances as a musician.[45][46] At these shows she was supported by Benjamin Lazar Davis, Tōth, Will Graefe, and Nick Cianci, respectively.[47][46] On March 18, 2020, Hawke released the lead single "By Myself" and announced her debut album, Blush.[48] Amid the 2020 Black Lives Matter protests, Hawke wrote, "I feel like this is not a time for self-promotion. It is a time for activation, education and self-examination."[49] The album's second single, "Coverage", was released on April 22, 2020,[50] and its music video, directed by her father, Ethan Hawke, followed on April 28.[51] Initially set for release on June 19, 2020, Blush was delayed to August 21, 2020.[52][46] To support the album, Hawke made her first appearance as a musical guest on The Today Show in late August 2020.[53]
On June 29, 2022, alongside the release of the single "Thérèse", Hawke announced her second album Moss, which was released on September 23, 2022.[54]
^"Uma Thurman Daughter's Name Revealed". People. October 17, 2012. Archived from the original on July 18, 2013. Retrieved July 17, 2013. 'I would like to announce Uma and Arki's daughter's name for the first time officially: Rosalind Arusha Arkadina Altalune Florence Thurman-Busson, better known to family and friends as Luna,' the actress's rep Gabrielle Kachman tells People exclusively.
^Carr, David (January 10, 2013). "In His Comfort Zone". The New York Times. Archived from the original on March 21, 2013. Retrieved March 22, 2013.
Ivanka Trump speaks on the G20 Osaka Summit, recorded June 28, 2019
Ivana Marie "Ivanka" Trump (/ɪˈvɑːŋkə/; born October 30, 1981) is an American businesswoman. She is the second child of Donald Trump, the president of the United States, and his first wife, Ivana. Trump was a senior advisor in her father's first administration (2017–2021), and also the director of the Office of Economic Initiatives and Entrepreneurship.
Trump converted to Judaism prior to marrying Jared Kushner, a real estate developer, in 2009. The couple have three children. Before her political career, she was an executive vice president of the family-owned Trump Organization and a boardroom judge on her father's television show, The Apprentice. She also had a fashion lifestyle brand under her own name that consisted of apparel, footwear, handbags, jewelry, and fragrance; Trump shut down the company in July 2018.
In January 2017, Trump became an unofficial advisor in her father's first presidential administration alongside her husband; in March of that year, she became an official employee. While serving in the White House, she retained ownership of her businesses, which raised ethics concerns, specifically over conflicts of interest.
Ivana Marie Trump was born on October 30, 1981,[1][2] in Manhattan, New York City, the second child of Donald Trump and his first wife, the Czech-American model Ivana (née Zelníčková).[3][4] For most of her life, she has been nicknamed "Ivanka", a Slavic diminutive form of her first name, Ivana.[5] Trump's parents divorced in 1990 when she was nine years old.[6] She has two full brothers, Donald Jr. and Eric, a half-sister, Tiffany, and a half-brother, Barron.
Trump attended Christ Church and the Chapin School in Manhattan until switching at age 15 to Choate Rosemary Hall in Wallingford, Connecticut.[7] While attending boarding school as a teenager, she also began modeling "on weekends and holidays and absolutely not during the school year", according to her late mother, Ivana.[8] In May 1997, she was featured on the cover of Seventeen, which ran a story on "celeb moms & daughters",[9][8] as well as in campaigns for Tommy Hilfiger, Thierry Mugler, and Versace.[10][11]
After graduating from Wharton, Trump briefly worked for the real estate firm Forest City Ratner.[17] As executive vice president of development and acquisitions at the Trump Organization, she was responsible for the domestic and global expansion of the company's real estate interests.[18] In February 2012, Trump led the company's response to the General Services Administration's request for proposal (RFP), which resulted in the Trump Organization's selection to develop the historic Old Post Office in Washington, D.C.[19][20] She then oversaw the $200 million conversion of the building into a luxury hotel, which opened in 2016.[21][22] Trump also led the acquisition and redevelopment of the famed Doral Hotel, a 700-room resort in Miami.[23]
Independent of her family's real estate business, Trump also had her own line of Ivanka Trump fashion items, which included clothes, handbags, shoes, and accessories, available in U.S. and Canadian department stores including Macy's and Hudson's Bay.[24]
Trump in July 2007
Trump formed a partnership with Dynamic Diamond Corp., the company of the diamond vendor Moshe Lax, in 2007 to create Ivanka Trump Fine Jewelry, a line of diamond and gold jewelry sold at her first flagship retail store in Manhattan.[25][26] She also began selling jewelry online through her brand's website, which had a major relaunch in August 2010.[27] Her flagship store moved from Madison Avenue to 109 Mercer Street, a larger space in the SoHo district, in November 2011.[28][29] Celebrities photographed wearing her jewelry included Jennifer Lopez on the cover of Glamour[30] and Rihanna on the cover of W.[31] Her brand was named "Launch of the Year" in 2010 by Footwear News[32] and went on to win other awards.[33]
Between 2010 and 2018, Trump was also a paid consultant for the Trump Organization. This dual status as a consultant and "non-employee" was questioned in reviews of her taxes and financial disclosures.[34][35] Trump closed down her company and separated herself from her business affiliations at the Trump Organization after she moved to Washington, D.C., to serve as a senior advisor to her father in the White House.[36][37] Members of 100 Women in Hedge Funds elected Trump to its board in December 2012.[38]
Trump's flagship store on Mercer Street was reported closed in October 2015, with her brand remaining available at various retail locations, including Trump Tower, Hudson's Bay, and fine-jewelry stores.[39][40] She also had her own line of fashion items available in department stores.[41] Her brand faced criticism for using rabbit fur and was involved in a design-infringement lawsuit with Aquazzura Italia SRL, which was later settled.[42][43] Shoes sold under her brand's name were supplied by Chengdu Kameido Shoes and Hangzhou HS Fashion.[44] The Accessories Council Excellence Awards recognized Trump with the Breakthrough Award, presented by the designer Carolina Herrera, in 2015.[45]
Between March and July 2016, Trump applied for 36 trademarks in China. Seven of them were approved between her father's inauguration in January 2017 and Chinese president Xi Jinping's state visit to the U.S. in April. Three provisional trademarks for handbags, jewelry, and spa services were granted on the day Xi dined with President Trump and his family at Mar-a-Lago.[46] According to a trademark lawyer, the process usually takes 18 to 24 months. A Chinese government spokesman said that "the government handles all trademark applications equally."[47] The Washington Post reported in 2017 that "an astounding 258 trademark applications were lodged under variations of Ivanka, Ivanka Trump and similar-sounding Chinese characters between Nov 10 and the end of last year... none appear to have a direct business link with the US president's daughter."[48] In 2017, she partnered with the World Bank to launch a fund for financing female entrepreneurs.[49]
Neiman Marcus and Nordstrom dropped Trump's fashion line due to poor sales in 2017, and other retailers followed.[50][51] Three members of China Labor Watch were arrested in China while investigating a company that produced shoes for American brands, including Trump's.[52][53] Trump announced in July 2018 that she was shutting down her company, having decided to pursue a career in public policy rather than return to her fashion business.[54][55][56]
In 1997, at the age of 15, Trump co-hosted the Miss Teen USA Pageant, which was partially owned by her father, Donald Trump, from 1996 to 2005.[8] In 2006, she was a guest judge on Project Runway's third season. She reappeared as a guest judge on season 4 of Project Runway All Stars in 2014 and 2015.[63] In 2010, Trump and her husband made a cameo portraying themselves in Season 4 Episode 6 of Gossip Girl.[64]
Trump began modeling as a teenager while attending boarding school, working "on weekends and holidays and absolutely not during the school year", according to her mother, Ivana Trump.[65] She was featured in advertisements for Tommy Hilfiger,[65] Elle,[66] Vogue,[67] Teen Vogue,[68] Harper's Bazaar,[69] and Thierry Mugler,[70] and also engaged in fashion runway work.[71][70][72][73] In May 1997, she appeared on the cover of Seventeen.[74] Trump has been profiled in many women's fashion magazines, including Vogue,[75] Glamour,[76] Marie Claire,[77] and Elle,[78] and has been featured on the covers of Harper's Bazaar,[69] Forbes, Forbes Life,[79] Marie Claire, Golf Digest,[80] Town & Country,[81] Elle Décor,[82] Shape,[83] and Stuff.[84] Trump was featured in Vanity Fair's annual International Best Dressed Hall of Fame List in 2007 and 2008.[85]
Trump introduced her father at the Trump Tower in 2015 as he announced his candidacy for president of the United States.[95][96] She publicly endorsed his presidential campaign and made public appearances supporting and defending him.[97][98][99] However, she admitted mixed feelings about his presidential ambitions, saying in October 2015, "As a citizen, I love what he's doing. As a daughter, it's obviously more complicated."[100]
In January 2016, Trump praised her father in a radio ad that aired in the early voting states of Iowa and New Hampshire.[101][102] She appeared at his side following the results in early voting states in 2016, briefly speaking in South Carolina.[103][104] She was unable to vote in the New York primary in April 2016 because she had missed the October 2015 deadline to change her registration to Republican.[105]
Trump introduced her father immediately before his own speech at the 2016 Republican National Convention (RNC) in July.[106] The George Harrison song "Here Comes the Sun" was used as her entrance music. She stated, "One of my father's greatest talents is the ability to see the potential in people", and said he would "Make America Great Again".[107] Her speech was well received for portraying Donald Trump "in a warmer-than-usual light", according to The Washington Post.[108] Viewers commented that it was "one of the best – if not the best – of the night" and that Trump was the "greatest asset Donald Trump has".[109] Others called her speech the "high point of the convention".[110]
An earlier Post article had questioned whether the policy positions Trump espoused were closer to those of Hillary Clinton than to those of her father.[111] After the speech, the George Harrison estate complained that the use of his song was offensive and against its wishes.[109] The next morning, Ivanka's official Twitter account tweeted, "Shop Ivanka's look from her #RNC speech" with a link to a Macy's page that featured the dress she had worn.[112]
After her father's election, Trump wore a bracelet on a 60 Minutes segment with her family; her company then used the appearance in a marketing effort. When asked about it, she pointed to a marketing employee at one of her companies.[113]
In 2017, the artist Richard Prince returned a $36,000 payment he received in 2014 for a work depicting Trump as a protest against her father.[114] A coalition of New York art world figures unhappy with President Trump started an Instagram account called Dear Ivanka to protest against Donald Trump's presidency.[115]
In January 2017, Trump resigned from her position at the Trump Organization.[119] The organization also removed images of Trump and her father from their websites, in accordance with official advice on federal ethics rules.[120]
Trump (fourth from right) attending the signing ceremony for the INSPIRE Women Act on February 28, 2017, in the Oval Office of the White House
After advising her father in an unofficial capacity for the first two months of his administration, Trump was appointed "First Daughter and Advisor to the President",[121][122] becoming a government employee, on March 29, 2017.[123][124][n 1] She took no salary for the position and received no government health benefits during her four years at the White House.[128][129][130] She also became the head of the newly established Office of Economic Initiatives and Entrepreneurship.[131]
During the early months of her father's administration, some commentators compared her role to that of Julie Nixon Eisenhower, daughter of President Richard Nixon and one of the most vocal defenders of his administration; Ivanka Trump likewise defended President Trump and his administration against a myriad of allegations.[132][133] The Washington Post opinion columnist Alyssa Rosenberg wrote, "Both daughters served as important validators for their fathers."[132]
In late April 2017, Trump hired Julie Radford as her chief of staff. Before the end of the month, Trump and Radford traveled with Dina Powell and Hope Hicks to the first W20 women's summit, organized by the National Council of German Women's Organizations and the Association of German Women Entrepreneurs[134] as one of the preparatory meetings leading up to the G20 head-of-state summit in July. At the conference, Trump spoke about women's rights. The US media reported that when she praised her father as an advocate for women, some people in the audience hissed and booed.[135][136][137] The same month, Trump and then World Bank president Jim Yong Kim authored an op-ed in the Financial Times on women's economic empowerment,[138] highlighting the critical role that women play in the development of societies and the business case for involving women in the formal economy.[139] Trump advocated for revisions to the Senate tax bill that would double the child tax credit.[140]
In July 2017, Trump attended the G20 Summit in Hamburg, Germany, with President Trump and the United States delegation.[141] She launched We-Fi (Women Entrepreneurs Finance Initiative),[142] a United States-led billion-dollar World Bank initiative to advance women's entrepreneurship.[143]
In August 2017, President Trump announced that Ivanka would lead a U.S. delegation to India that fall in support of global women's entrepreneurship.[144][50][145] In September 2017, Trump delivered an anti-human-trafficking speech at the United Nations General Assembly, calling trafficking "the greatest human rights issue of our time".[146] The event, held in collaboration with Great Britain and Ireland, was hosted by then British prime minister Theresa May, who personally invited Trump to participate.[146]
President Trump, Ivanka and British prime minister Theresa May attend a business roundtable event at St James's Palace in London, June 4, 2019.
Trump supported passage of the Fight Online Sex Trafficking Act (HR 1865), which passed both houses of Congress and was signed into law by President Donald Trump in 2018.[149]
She and her father attended the 2019 G20 Osaka summit in late June 2019; the French government released a video of her awkwardly inserting herself into a conversation with world leaders, leading to online parodies and memes.[150][151]
After the G20 Summit in Osaka in June 2019, Trump joined President Trump in meeting the North Korean leader Kim Jong-un in the Korean Demilitarized Zone.[152][153] She described the experience as "surreal".[152]
The Women's Global Development and Prosperity Initiative (W-GDP) also aimed to increase access to vocational training, capital, and networks for women in the workforce, and to remove limits on women's economic participation. Bipartisan bills to codify the initiative were introduced in both houses of Congress.[161]
Trump backed a bill to fund paid family and medical leave for federal employees, which was passed by the Senate in December 2019.[162]
In 2021, a Government Accountability Office audit concluded that Trump's initiative, which spent $265 million a year of taxpayer money on 19 women's empowerment projects, had failed to target the money toward projects related to women's empowerment and had not measured the impact of the spending.[163]
In January 2020, Trump organized a Human Trafficking Summit at the White House, where President Trump signed an executive order expanding his domestic policy office with a new position solely focused on combating human trafficking.[164][154] In June 2020, Trump hosted an event at the White House with Attorney General William Barr, special advisor Heather C. Fischer, non-profit leaders, and survivors of human trafficking to announce $35 million in grant funding to aid victims of human trafficking.[155]
In July 2020, Trump tweeted a picture of herself with a can of Goya Foods beans, endorsing the product. The owner of Goya Foods had praised President Trump days earlier, leading to a backlash against the company. Trump's tweet raised ethics concerns, given that she was at the time an official adviser in the White House, and employees in public office are not permitted to endorse products.[168]
In September 2020, Trump joined William Barr, Georgia governor Brian Kemp, Georgia first lady Marty Kemp, and Tim Tebow in Atlanta to announce $100 million in grant funding to combat human trafficking.[169][170]
While serving in her father's administration, Trump retained ownership of businesses, which drew criticism from government ethics experts who said it created conflicts of interest.[171] It is not possible to determine the exact amount of Trump's outside income while working in her father's administration because she is only required to report the worth of her assets and liabilities in ranges to the Office of Government Ethics.[171] The incomes of Trump and her husband Jared Kushner ranged from $36.2 million to $157 million in 2019, at least $29 million in 2018, and at least $82 million in 2017.[171] In 2019, she earned $3.9 million from her stake in the Trump hotel in Washington, D.C.[171]
Trump declined to speak at the rally at the Ellipse on January 6, 2021, but was in attendance.[173] During the ensuing riot at the U.S. Capitol, she acted as an intermediary between besieged U.S. officials and the President, encouraging her father to post a video on Twitter condemning the riot. (Donald Trump's video, in which he told the rioters "we love you", resulted in his being banned from the platform.)[174]
When asked in November 2022 about her father's 2024 presidential bid, she said,
I love my father very much. This time around I am choosing to prioritize my young children and the private life we are creating as a family. I do not plan to be involved in politics. While I will always love and support my father, going forward I will do so outside the political arena. I am grateful to have had the honor of serving the American people and I will always be proud of many of our Administration's accomplishments.[176]
Trump (second from left in first row) at Seeds of Peace in New York City in 2009
In 2007, Trump donated $1,000 to the presidential campaign of then-Senator Hillary Clinton.[177][178] In 2012, she endorsed Mitt Romney's presidential campaign.[179] In 2013, Trump and her husband hosted a fundraiser for the Democratic politician Cory Booker, and the couple bundled more than $40,000 for Booker's U.S. Senate campaign.[180]
During her father's presidency, Trump transformed from a liberal to an "unapologetically" pro-life, "proud Trump Republican".[181] At the 2016 Republican National Convention, she said of her political views: "Like many of my fellow millennials, I do not consider myself categorically Republican or Democrat."[182] In 2018, Trump changed her New York voter registration from Democratic to Republican.[183][184]
In 2010, Trump cofounded Girl Up with the United Nations Foundation.[185] She was a member of the board of the Donald J. Trump Foundation until it was dissolved after then New York attorney general Barbara Underwood filed a civil lawsuit against the foundation for "persistently illegal conduct" with respect to its money.[186] In 2014, Trump launched IvankaTrump.com and the Women Who Work campaign, which focused on young, modern professional women and aimed to provide a comprehensive lifestyle guide.[187] In November 2019, Trump's father was ordered to pay a $2 million settlement for misusing the foundation for his business and political purposes.[188] The settlement also included mandatory training requirements for Trump and her two elder brothers.[189]
Trump also has ties to a number of Jewish charities, including Chai Lifeline, a charity which helps to look after children with cancer.[190] Other charities she supports include United Hatzalah, to which her father, Donald Trump, has reportedly made six-figure donations in the past.[191][192] After she was appointed advisor to the president, Trump donated the unpaid half of the advance payments for her book Women Who Work: Rewriting the Rules for Success to the National Urban League and the Boys and Girls Clubs of America.[88]
Trump collaborated with the nonprofit organizations CityServe, City of Destiny, and Mercy Chefs to supply a million meals to Ukrainian families in March 2022.[193] In December that year, she purchased generators for CityServe's partner churches in Ukraine that were without power.[194] That same year, alongside healthcare industry leaders, she organized five cargo planes of requested medical supplies for Ukraine with the support of the first lady of Poland and the Polish ambassador to the UN.[195][196]
Trump has a close relationship with her father, who has publicly expressed his admiration for her on several occasions.[197][198] Ivanka has likewise praised her father, complimenting his leadership skills and saying he empowers other people.[199] Sarah Ellison, writing for Vanity Fair in 2018, noted that "everyone in the family seems to acknowledge" that Ivanka is her father's "favorite" child,[200] something the siblings themselves confirmed in a 2015 network television interview with Barbara Walters.[201] According to her late mother, Ivanka speaks French and understands Czech.[202]
In January 2017, it was announced that she and Kushner had made arrangements to establish a family home in the Kalorama neighborhood of Washington, D.C.[203] Federal filings implied that, in 2017, Trump and her husband may have had assets worth upwards of $740 million.[204] They had previously shared an apartment on Park Avenue in New York City, which Trump chose for its proximity to her work at the Trump Organization. The residence was featured in Elle Decor in 2012, with Kelly Behun as its interior decorator.[205] Since leaving Washington in 2021, Ivanka and her husband have lived in Surfside, Florida.[206][207]
Trump began practicing Brazilian jiu-jitsu under the Valente brothers alongside her husband and children.[208]
Trump was in a near four-year relationship with Greg Hersch while in college.[209][210] From 2001 to 2005, she dated James "Bingo" Gubelmann.[211][12][209] In 2005, she started dating the real estate developer Jared Kushner, whom she met through mutual friends.[212][213] The couple broke up in 2008 due to the objections of Kushner's parents[212] but reconciled and married in a Jewish ceremony on October 25, 2009.[212][214] They have three children: a daughter born in July 2011, and two sons born in October 2013 and March 2016 respectively.[215][216][217] In an interview on The Dr. Oz Show, Trump revealed that she had suffered from postpartum depression after each of her pregnancies.[218]
Trump (far right) with (from center to right) her father, stepmother, and husband at the Western Wall at Temple Mount in Jerusalem in May 2017
Raised as a Presbyterian Christian,[219] Trump converted to Orthodox Judaism in July 2009,[220][221] after studying with Elie Weinstock from the Modern Orthodox Ramaz School.[222] Trump took the Hebrew name "Yael" (Hebrew: יָעֵל, lit. 'mountain goat' or 'ibex').[223][224] She describes her conversion as an "amazing and beautiful journey" which her father supported "from day one", adding that he has "tremendous respect" for the Jewish faith.[225] She attests to keeping a kosher diet and observing the Jewish Sabbath, saying in 2015: "We're pretty observant... It's been such a great life decision for me... I really find that with Judaism, it creates an amazing blueprint for family connectivity. From Friday to Saturday we don't do anything but hang out with one another. We don't make phone calls."[226] When living in New York City, she sent her daughter to a Jewish kindergarten, saying: "It's such a blessing for me to have her come home every night and share with me the Hebrew that she's learned and sing songs for me around the holidays."[225]
In 2012, the Wharton Club of New York, the official Wharton alumni association for the New York metropolitan area,[230] gave Trump the Joseph Wharton Award for Young Leadership, one of their four annual awards for alumni.[231]
^ The original designation of "First Daughter" was later dropped from the official title.[125] Ivanka Trump is sometimes also called a "Senior Advisor to the President" (or "senior advisor to the President", without the upper-case "S" and "A"),[126][127] even though that is actually the title of her husband, Jared Kushner; her own title is "Advisor to the President".[124]
^ "About Ivanka". Ivanka Trump. November 14, 2012. Archived from the original on November 14, 2012. Retrieved June 12, 2017. For level of honor see last paragraph of website bio.
^ "Coverage of the Republican Convention. Aired 10-11p ET". CNN. Archived from the original on November 30, 2016. Retrieved November 29, 2016. As the proud daughter of your nominee, I am here to tell you that this is the moment and Donald Trump is the person to make America great again!
Dame Elizabeth Rosemond Taylor (February 27, 1932 – March 23, 2011) was a British–American actress. She began her career as a child actress in the early 1940s and was one of the most popular stars of classical Hollywood cinema in the 1950s. She then became the world's highest-paid movie star in the 1960s, remaining a well-known public figure for the rest of her life. In 1999, the American Film Institute ranked her seventh on its greatest female screen legends list.
Born in London to socially prominent American parents, Taylor moved with her family to Los Angeles in 1939 at the age of 7. She made her acting debut with a minor role in the Universal Pictures film There's One Born Every Minute (1942), but the studio ended her contract after a year. She was then signed by Metro-Goldwyn-Mayer and became a popular teen star after appearing in National Velvet (1944). She transitioned to mature roles in the 1950s, when she starred in the comedy Father of the Bride (1950) and received critical acclaim for her performance in the drama A Place in the Sun (1951). She starred in the historical adventure epic Ivanhoe (1952) with Robert Taylor and Joan Fontaine. Despite being one of MGM's most bankable stars, Taylor wished to end her career in the early 1950s. She resented the studio's control and disliked many of the films to which she was assigned.
She began receiving more enjoyable roles in the mid-1950s, beginning with the epic drama Giant (1956), and starred in several critically and commercially successful films in the following years. These included two film adaptations of plays by Tennessee Williams: Cat on a Hot Tin Roof (1958) and Suddenly, Last Summer (1959); Taylor won a Golden Globe for Best Actress for the latter. Although she disliked her role as a call girl in BUtterfield 8 (1960), her last film for MGM, she won the Academy Award for Best Actress for her performance. During the production of the film Cleopatra in 1961, Taylor and co-star Richard Burton began an extramarital affair, which caused a scandal. Despite public disapproval, they continued their relationship and were married in 1964. Dubbed "Liz and Dick" by the media, they starred in 11 films together, including The V.I.P.s (1963), The Sandpiper (1965), Who's Afraid of Virginia Woolf? (1966), and The Taming of the Shrew (1967). Taylor received the best reviews of her career for Woolf, winning her second Academy Award and several other awards for her performance. She and Burton divorced in 1974 but reconciled soon after, remarrying in 1975. The second marriage ended in divorce in 1976.
Taylor's acting career began to decline in the late 1960s, although she continued starring in films until the mid-1970s, after which she focused on supporting the career of her sixth husband, United States Senator John Warner. In the 1980s, she acted in her first substantial stage roles and in several television films and series. After Sophia Loren, she became the second celebrity to launch a perfume brand. Taylor was one of the first celebrities to take part in HIV/AIDS activism. She co-founded the American Foundation for AIDS Research in 1985 and the Elizabeth Taylor AIDS Foundation in 1991. From the early 1990s until her death, she dedicated her time to philanthropy, for which she received several accolades, including the Presidential Citizens Medal in 2001.
Throughout her career, Taylor's personal life was the subject of constant media attention. She was married eight times to seven men, converted to Judaism, endured several serious illnesses, and led a jet set lifestyle, including assembling one of the most expensive private collections of jewelry in the world. After many years of ill health, Taylor died from congestive heart failure in 2011, at the age of 79.
Two-year-old Taylor, mother Sara Sothern, and brother Howard, in 1934
Elizabeth Rosemond Taylor was born on February 27, 1932, at Heathwood, her family's home at 8 Wildwood Road in Hampstead Garden Suburb, northwest London, England.[1]: 3–10 She received dual British–American citizenship at birth as her parents, art dealer Francis Lenn Taylor (1897–1968) and stage actress Sara Sothern (1895–1994), were United States citizens, both originally from Arkansas City, Kansas.[1]: 3–10 [a]
They had moved to London in 1929 and opened an art gallery on Bond Street; their first child, a son named Howard (died 2020), was born the same year.
The family lived in London during Taylor's childhood.[1]: 11–19 Their social circle included artists such as Augustus John and Laura Knight and politicians such as Colonel Victor Cazalet.[1]: 11–19 Cazalet was Taylor's unofficial godfather and an important influence in her early life.[1]: 11–19 She was enrolled in Byron House School, a Montessori school in Highgate, and was raised according to the teachings of Christian Science, the religion of her mother and Cazalet.[1]: 3, 11–19, 20–23
In early 1939, the Taylors decided to return to the United States due to fear of impending war in Europe.[1]: 22–26 United States ambassador Joseph P. Kennedy contacted her father, urging him to return to the US with his family.[5] Sara and the children left first in April 1939 aboard the ocean liner SS Manhattan and moved in with Taylor's maternal grandfather in Pasadena, California.[1]: 22–28 [6] Francis stayed behind to close the London gallery and joined them in December.[1]: 22–28 In early 1940, he opened a new gallery in Los Angeles. After briefly living in Pacific Palisades, Los Angeles, with the Chapman family, the Taylor family settled in Beverly Hills, California, where the two children were enrolled in Hawthorne School.[1]: 27–34
In California, Taylor's mother was frequently told that her daughter should audition for films.[1]: 27–30 Taylor's eyes in particular drew attention; they were blue, to the extent of appearing violet, and were rimmed by dark double eyelashes caused by a genetic mutation.[7][1]: 9 Sara was initially opposed to Taylor appearing in films, but after the outbreak of war in Europe made return there unlikely, she began to view the film industry as a way of assimilating to American society.[1]: 27–30 Francis Taylor's Beverly Hills gallery had gained clients from the film industry soon after opening, helped by the endorsement of gossip columnist Hedda Hopper, a friend of the Cazalets.[1]: 27–31 Through a client and a school friend's father, Taylor auditioned for both Universal Pictures and Metro-Goldwyn-Mayer in early 1941.[8]: 27–37 Both studios offered Taylor contracts, and Sara Taylor chose to accept Universal's offer.[8]: 27–37
Taylor began her contract in April 1941 and was cast in a small role in There's One Born Every Minute (1942).[8]: 27–37 She did not receive other roles, and her contract was terminated after a year.[8]: 27–37 Universal's casting director explained her dislike of Taylor, stating that "the kid has nothing ... her eyes are too old, she doesn't have the face of a child".[8]: 27–37 Biographer Alexander Walker agrees that Taylor looked different from the child stars of the era, such as Shirley Temple and Judy Garland.[8]: 32 Taylor later said that, "apparently, I used to frighten grown ups, because I was totally direct".[9]
Taylor received another opportunity in late 1942, when her father's acquaintance, MGM producer Samuel Marx, arranged for her to audition for a minor role in Lassie Come Home (1943), which required a child actress with an English accent.[1]: 22–23, 27–37 After a trial contract of three months, she was given a standard seven-year contract in January 1943.[1]: 38–41 Following Lassie, she appeared in minor uncredited roles in two other films set in England – Jane Eyre (1943) playing Helen Burns, and The White Cliffs of Dover (1944).[1]: 38–41
Mickey Rooney and Taylor in National Velvet (1944), her first major film role
Taylor was cast in her first starring role at the age of 12, when she was chosen to play a girl who wants to compete as a jockey in the exclusively male Grand National in National Velvet.[1]: 40–47 She later called it "the most exciting film" of her career.[10] Since 1937, MGM had been looking for a suitable actress with a British accent who could ride horses, and decided on Taylor at the recommendation of White Cliffs director Clarence Brown, who knew she had the necessary skills.[1]: 40–47 At the time, Taylor was deemed too short for the role, so filming was delayed several months to allow her to grow an inch or two; in the interim, she practiced her horseback riding.[1]: 40–47
As part of its effort to develop Taylor into a film star, MGM required her to wear braces to straighten her teeth and had two of her baby teeth pulled.[1]: 40–47 The studio also wanted to dye her hair and change the shape of her eyebrows, and proposed that she use the screen name "Virginia", but Taylor and her parents refused.[9]
National Velvet became a box-office success upon its release on Christmas 1944.[1]: 40–47 Bosley Crowther of The New York Times stated that "her whole manner in this picture is one of refreshing grace",[11] while James Agee of The Nation wrote that she "is rapturously beautiful... I hardly know or care whether she can act or not."[12]
Taylor later stated that her childhood ended when she became a star, as MGM started to control every aspect of her life.[9][13][1]: 48–51 She described the studio as a "big extended factory", where she was required to adhere to a strict daily schedule.[9] Her days were spent attending school and filming on the studio lot; in the evenings, she took dancing and singing classes and practiced the following day's scenes.[1]: 48–51 Following the success of National Velvet, MGM gave Taylor a new seven-year contract with a weekly salary of $750 and cast her in a minor role in the third film of the Lassie series, Courage of Lassie (1946).[1]: 51–58 MGM also published a book of Taylor's writings about her pet chipmunk, Nibbles and Me (1946), and had paper dolls and coloring books made in her likeness.[1]: 51–58
When Taylor turned 15 in 1947, MGM began to cultivate a more mature public image for her by organizing photo shoots and interviews that portrayed her as a "normal" teenager attending parties and going on dates.[8]: 56–57, 65–74 Film magazines and gossip columnists also began comparing her to older actresses such as Ava Gardner and Lana Turner.[8]: 71 Life called her "Hollywood's most accomplished junior actress" for her two film roles that year.[8]: 69 In the critically panned Cynthia (1947), Taylor portrayed a frail girl who defies her over-protective parents to go to the prom; in the period film Life with Father (1947), opposite William Powell and Irene Dunne, she portrayed the love interest of a stockbroker's son.[14][1]: 58–70 [15]
They were followed by supporting roles as a teenaged "man-stealer" who seduces her peer's date to a high school dance in the musical A Date with Judy (1948), and as a bride in the romantic comedy Julia Misbehaves (1948), which became a commercial success, grossing over $4 million at the box office.[16][1]: 82
Taylor's last adolescent role was as Amy March in Mervyn LeRoy's Little Women (1949), a box-office success.[17] The same year, Time featured Taylor on its cover, and called her the leader among Hollywood's next generation of stars, "a jewel of great price, a true sapphire".[18]
Taylor made the transition to adult roles when she turned 18 in 1950. In her first mature role, the thriller Conspirator (1949), she plays a woman who begins to suspect that her husband is a Soviet spy.[1]: 75–83 Taylor had been only 16 at the time of its filming, but its release was delayed until March 1950, as MGM disliked it and feared it could cause diplomatic problems.[1]: 75–83 [19] Taylor's second film of 1950 was the comedy The Big Hangover (1950), co-starring Van Johnson.[20] It was released in May. That same month, Taylor married hotel-chain heir Conrad "Nicky" Hilton Jr. in a highly publicized ceremony.[1]: 99–105 The event was organized by MGM, and used as part of the publicity campaign for Taylor's next film, Vincente Minnelli's comedy Father of the Bride (1950), in which she appeared opposite Spencer Tracy and Joan Bennett as a bride preparing for her wedding.[1]: 99–105 The film became a box-office success upon its release in June, grossing $6 million worldwide ($78,414,938 in 2024 dollars [21]), and was followed by a successful sequel, Father's Little Dividend (1951), ten months later.[22]
Taylor's next film release, George Stevens' A Place in the Sun (1951), marked a departure from her earlier films. According to Taylor, it was the first film in which she had been asked to act, instead of simply being herself,[13] and it brought her critical acclaim for the first time since National Velvet.[1]: 96–97 Based on Theodore Dreiser's novel An American Tragedy (1925), it featured Taylor as a spoiled socialite who comes between a poor factory worker (Montgomery Clift) and his pregnant girlfriend (Shelley Winters).[1]: 91 Stevens cast Taylor as she was "the only one ... who could create this illusion" of being "not so much a real girl as the girl on the candy-box cover, the beautiful girl in the yellow Cadillac convertible that every American boy sometime or other thinks he can marry."[1]: 92 [23]
A Place in the Sun was a critical and commercial success, grossing $3 million.[24] Herb Golden of Variety said that Taylor's "histrionics are of a quality so far beyond anything she has done previously, that Stevens' skilled hands on the reins must be credited with a minor miracle."[25] A.H. Weiler of The New York Times wrote that she gives "a shaded, tender performance, and one in which her passionate and genuine romance avoids the pathos common to young love as it sometimes comes to the screen."[26]
Taylor next starred in the romantic comedy Love Is Better Than Ever (1952).[1]: 124–125 According to Alexander Walker, MGM cast her in the "B-picture" as a reprimand for divorcing Hilton in January 1951 after only eight months of marriage, which had caused a public scandal that reflected negatively on her.[1]: 124–125 After completing Love Is Better Than Ever, Taylor was sent to Britain to take part in the historical epic Ivanhoe (1952), which was one of the most expensive projects in the studio's history.[1]: 129–132 She was not happy about the project, finding the story superficial and her role as Rebecca too small.[1]: 129–132 Regardless, Ivanhoe became one of MGM's biggest commercial successes, earning $11 million in worldwide rentals.[27]
Taylor's last film made under her old contract with MGM was The Girl Who Had Everything (1953), a remake of the pre-code drama A Free Soul (1931).[1]: 145 Despite her grievances with the studio, Taylor signed a new seven-year contract with MGM in the summer of 1952.[1]: 139–143 Although she wanted more interesting roles, the decisive factor in continuing with the studio was her financial need; she had recently married British actor Michael Wilding, and was pregnant with her first child.[1]: 139–143 In addition to granting her a weekly salary of $4,700 ($55,237 in 2024 dollars [21]), MGM agreed to give the couple a loan for a house, and signed her husband for a three-year contract.[1]: 141–143 Due to her financial dependency, the studio now had even more control over her than previously.[1]: 141–143
Publicity photo, 1954
Taylor's first two films made under her new contract were released ten days apart in early 1954.[1]: 153 The first was Rhapsody, a romantic film starring her as a woman caught in a love triangle with two musicians. The second was Elephant Walk, a drama in which she played a British woman struggling to adapt to life on her husband's tea plantation in Ceylon. She had been loaned to Paramount Pictures for the film after its original star, Vivien Leigh, fell ill.[1]: 148–149
In the fall, Taylor starred in two more film releases. Beau Brummell was a Regency era period film, another project in which she was cast against her will.[1]: 153–154 Taylor disliked historical films in general, as their elaborate costumes and makeup required her to wake up earlier than usual to prepare. She later said that she gave one of the worst performances of her career in Beau Brummell.[1]: 153–154 The second film was Richard Brooks' The Last Time I Saw Paris, based on F. Scott Fitzgerald's short story. Although she had wanted to be cast in The Barefoot Contessa (1954) instead, Taylor liked the film, and later stated that it "convinced me I wanted to be an actress instead of yawning my way through parts."[1]: 153–157 [28] While The Last Time I Saw Paris was not as profitable as many other MGM films, it garnered positive reviews.[1]: 153–157 [28] Taylor became pregnant again during the production, and had to agree to add another year to her contract to make up for the period spent on maternity leave.[1]: 153–157
By the mid-1950s, the American film industry was beginning to face serious competition from television, which resulted in studios producing fewer films and focusing instead on their quality.[8]: 158–165 The change benefited Taylor, who finally found more challenging roles after several years of career disappointments.[8]: 158–165 After lobbying director George Stevens, she won the female lead role in Giant (1956), an epic drama about a ranching dynasty, which co-starred Rock Hudson and James Dean.[8]: 158–165 Its filming in Marfa, Texas, was a difficult experience for Taylor, as she clashed with Stevens, who wanted to break her will to make her easier to direct, and she was often ill, resulting in delays.[8]: 158–165 [29] To further complicate the production, Dean died in a car accident only days after completing filming; the grieving Taylor still had to film reaction shots for their joint scenes.[8]: 158–166 When Giant was released a year later, it became a box-office success and was widely praised by critics.[8]: 158–165 Although not nominated for an Academy Award like her co-stars, Taylor garnered positive reviews for her performance, with Variety calling it "surprisingly clever"[30] and The Manchester Guardian lauding her acting as "an astonishing revelation of unsuspected gifts" and naming her one of the film's strongest assets.[31]
MGM reunited Taylor with Montgomery Clift in Raintree County (1957), a Civil War drama which it hoped would replicate the success of Gone with the Wind (1939).[1]: 166–177 Taylor found her role as a mentally disturbed Southern belle fascinating, but overall disliked the film.[1]: 166–177 Although the film failed to become the type of success MGM had planned,[32] Taylor was nominated for the first time for an Academy Award for Best Actress for her performance.[33]
Taylor considered her next performance as Maggie the Cat in the screen adaptation of the Tennessee Williams play Cat on a Hot Tin Roof (1958) a career "high point." But it coincided with one of the most difficult periods in her personal life.[13] After completing Raintree County, she had divorced Wilding and married producer Mike Todd. She had completed only two weeks of filming in March 1958, when Todd was killed in a plane crash.[1]: 186–194 Although she was devastated, pressure from the studio and the knowledge that Todd had large debts led Taylor to return to work only three weeks later.[1]: 195–203 She later said that "in a way ... [she] became Maggie", and that acting "was the only time I could function" in the weeks after Todd's death.[13]
During the production, Taylor's personal life drew more attention when she began an affair with singer Eddie Fisher, whose marriage to actress Debbie Reynolds had been idealized by the media as the union of "America's sweethearts."[1]: 203–210 The affair – and Fisher's subsequent divorce – changed Taylor's public image from a grieving widow to a "homewrecker". MGM used the scandal to its advantage by featuring an image of Taylor posing on a bed in a slip in the film's promotional posters.[1]: 203–210 Cat grossed $10 million in American cinemas alone, and made Taylor the year's second-most profitable star.[1]: 203–210 She received positive reviews for her performance, with Bosley Crowther of The New York Times calling her "terrific",[34] and Variety praising her for "a well-accented, perceptive interpretation."[35] Taylor was nominated for an Academy Award[33] and a BAFTA.[36]
Taylor's next film, Joseph L. Mankiewicz's Suddenly, Last Summer (1959), was another Tennessee Williams adaptation, with a screenplay by Gore Vidal and also starring Montgomery Clift and Katharine Hepburn. The independent production earned Taylor $500,000 for playing the role of a severely traumatized patient in a mental institution.[1]: 203–210 Although the film was a drama about mental illness, childhood traumas, and homosexuality, it was again promoted with Taylor's sex appeal; both its trailer and poster featured her in a white swimsuit. The strategy worked, as the film was a financial success.[37] Taylor received her third Academy Award nomination[33] and her first Golden Globe for Best Actress for her performance.[1]: 203–210
By 1959, Taylor owed MGM one more film, which the studio decided should be BUtterfield 8 (1960), a drama about a high-class call girl, adapted from John O'Hara's 1935 novel of the same name.[1]: 211–223 The studio correctly calculated that Taylor's public image would make it easy for audiences to associate her with the role.[1]: 211–223 She hated the film for the same reason, but had no choice in the matter, although the studio agreed to her demands of filming in New York and casting Eddie Fisher in a sympathetic role.[1]: 211–223 As predicted, BUtterfield 8 was a major commercial success, grossing $18 million in world rentals.[1]: 224–236 Crowther wrote that Taylor "looks like a million dollars, in mink or in negligée",[38] while Variety stated that she gives "a torrid, stinging portrayal with one or two brilliantly executed passages within."[39] Taylor won her first Academy Award for Best Actress for her performance.[1]: 224–236
1961–1967: Cleopatra and other collaborations with Richard Burton
Richard Burton as Mark Antony with Taylor as Cleopatra in Cleopatra (1963)
After completing her MGM contract, Taylor starred in 20th Century-Fox's Cleopatra (1963). According to film historian Alexander Doty, this historical epic made her more famous than ever before.[40] She became the first movie star to be paid $1 million for a role; Fox also granted her 10% of the film's gross profits, as well as shooting the film in Todd-AO, a widescreen format for which she had inherited the rights from Mike Todd.[8]: 10–11 [1]: 211–223 The film's production – characterized by costly sets and costumes, constant delays, and a scandal caused by Taylor's extramarital affair with her co-star Richard Burton – was closely followed by the media, with Life proclaiming it the "Most Talked About Movie Ever Made."[8]: 11–12, 39, 45–46, 56 Filming began in England in 1960, but had to be halted several times because of bad weather and Taylor's ill health.[8]: 12–13 In March 1961, she developed nearly fatal pneumonia, which necessitated a tracheotomy; one news agency erroneously reported that she had died.[8]: 12–13 Once she had recovered, Fox discarded the already filmed material, and moved the production to Rome, changing its director to Joseph Mankiewicz, and the actor playing Mark Antony to Burton.[8]: 12–18 Filming was finally completed in July 1962.[8]: 39 The film's final cost was $62 million (equivalent to $644 million in 2024), making it the most expensive film made up to that point.[8]: 46
Cleopatra became the biggest box-office success of 1963 in the United States; the film grossed $15.7 million at the box office (equivalent to $161 million in 2024).[8]: 56–57 Regardless, it took several years for the film to earn back its production costs, which drove Fox near to bankruptcy. The studio publicly blamed Taylor for the production's troubles and unsuccessfully sued Burton and Taylor for allegedly damaging the film's commercial prospects with their behavior.[8]: 46 The film's reviews were mixed to negative, with critics finding Taylor overweight and her voice too thin, and unfavorably comparing her with her classically trained British co-stars.[8]: 56–58 [1]: 265–267 [41] In retrospect, Taylor called Cleopatra a "low point" in her career, and said that the studio had cut out the scenes which she felt provided the "core of the characterization."[13]
Taylor intended to follow Cleopatra by headlining an all-star cast in Fox's black comedy What a Way to Go! (1964), but negotiations fell through, and Shirley MacLaine was cast instead. In the meantime, film producers were eager to profit from the scandal surrounding Taylor and Burton, and they next starred together in Anthony Asquith's The V.I.P.s (1963), which mirrored the headlines about them.[8]: 42–45 [1]: 252–255, 260–266 Taylor played a famous model attempting to leave her husband for a lover, and Burton her estranged millionaire husband. Released soon after Cleopatra, it became a box-office success.[1]: 264 Taylor was also paid $500,000 (equivalent to $5.14 million in 2024) to appear in a CBS television special, Elizabeth Taylor in London, in which she visited the city's landmarks and recited passages from the works of famous British writers.[8]: 74–75
Taylor and Burton in The Sandpiper (1965)
After completing The V.I.P.s, Taylor took a two-year hiatus from films, during which she and Burton divorced their spouses and married each other.[8]: 112 The supercouple continued starring together in films in the mid-1960s, earning a combined $88 million over the next decade; Burton once stated, "They say we generate more business activity than one of the smaller African nations."[8]: 193 [42] Biographer Alexander Walker compared these films to "illustrated gossip columns", as their film roles often reflected their public personae, while film historian Alexander Doty has noted that the majority of Taylor's films during this period seemed to "conform to, and reinforce, the image of an indulgent, raucous, immoral or amoral, and appetitive (in many senses of the word) 'Elizabeth Taylor'".[1]: 294 [43] Taylor and Burton's first joint project following her hiatus was Vincente Minnelli's romantic drama The Sandpiper (1965), about an illicit love affair between a bohemian artist and a married clergyman in Big Sur, California. Its reviews were largely negative, but it grossed a successful $14 million at the box office (equivalent to $140 million in 2024).[8]: 116–118
Their next project, Who's Afraid of Virginia Woolf? (1966), an adaptation of a play of the same name by Edward Albee, featured the most critically acclaimed performance of Taylor's career.[8]: 142, 151–152 [1]: 286 She and Burton starred as Martha and George, a middle-aged couple going through a marital crisis. In order to convincingly play 50-year-old Martha, Taylor gained weight, wore a wig, and used makeup to make herself look older and tired – in stark contrast to her public image as a glamorous film star.[8]: 136–137 [1]: 281–282 At Taylor's suggestion, theatre director Mike Nichols was hired to direct the project, despite his lack of experience with film.[8]: 139–140 The production differed from anything she had done previously, as Nichols wanted to thoroughly rehearse the play before beginning filming.[8]: 141 Woolf was considered ground-breaking for its adult themes and uncensored language, and opened to "glorious" reviews.[8]: 140, 151 Variety wrote that Taylor's "characterization is at once sensual, spiteful, cynical, pitiable, loathsome, lustful, and tender."[44] Stanley Kauffmann of The New York Times stated that she "does the best work of her career, sustained and urgent."[45] The film also became one of the biggest commercial successes of the year.[8]: 151–152 [1]: 286 Taylor received her second Academy Award, and BAFTA, National Board of Review, and New York City Film Critics Circle awards for her performance.
Taylor and Burton in 1965
In 1966, Taylor and Burton performed Doctor Faustus for a week in Oxford to benefit the Oxford University Dramatic Society; he starred and she appeared in her first stage role as Helen of Troy, a part which required no speaking.[8]: 186–189 Although it received generally negative reviews, Burton produced it as a film, Doctor Faustus (1967), with the same cast.[8]: 186–189 It was also panned by critics and grossed only $600,000 at the box office (equivalent to $5.66 million in 2024).[8]: 230–232 Taylor and Burton's next project, Franco Zeffirelli's The Taming of the Shrew (1967), which they also co-produced, was more successful.[8]: 164 It posed another challenge for Taylor, as she was the only actor in the project with no previous experience of performing Shakespeare; Zeffirelli later stated that this made her performance interesting, as she "invented the part from scratch."[8]: 168 Critics found the play to be fitting material for the couple, and the film became a box-office success by grossing $12 million (equivalent to $113.16 million in 2024).[8]: 181, 186
Taylor's third film released in 1967, John Huston's Reflections in a Golden Eye, was her first without Burton since Cleopatra. Based on a novel of the same name by Carson McCullers, it was a drama about a repressed gay military officer and his unfaithful wife. It was originally slated to co-star Taylor's old friend Montgomery Clift, whose career had been in decline for several years owing to his substance abuse problems. Determined to secure his involvement in the project, Taylor even offered to pay for his insurance.[8]: 157–161 Clift, however, died of a heart attack before filming began; he was replaced in the role by Marlon Brando.[8]: 175, 189 Reflections was a critical and commercial failure at the time of its release.[8]: 233–234 Taylor and Burton's last film of the year was the adaptation of Graham Greene's novel, The Comedians, which received mixed reviews and was a box-office disappointment.[8]: 228–232
Taylor's career was in decline by the late 1960s. She had gained weight, was in her late 30s and did not fit in with New Hollywood stars such as Jane Fonda and Julie Christie.[8]: 135–136 [1]: 294–296, 307–308 After several years of nearly constant media attention, the public was tiring of Burton and her, and criticized their jet set lifestyle.[8]: 142, 151–152 [1]: 294–296, 305–306 In 1968, Taylor starred in two films directed by Joseph Losey – Boom! and Secret Ceremony – both of which were critical and commercial failures.[8]: 238–246 The former, based on Tennessee Williams' The Milk Train Doesn't Stop Here Anymore, features her as an ageing, serial-marrying millionaire, and Burton as a younger man who turns up on the Mediterranean island on which she has retired.[8]: 211–217 Secret Ceremony is a psychological drama that also stars Mia Farrow and Robert Mitchum.[8]: 242–243, 246 Taylor's third film with George Stevens, The Only Game in Town (1970), in which she played a Las Vegas showgirl who has an affair with a compulsive gambler, played by Warren Beatty, was unsuccessful.[8]: 287 [46]
The three 1972 films in which Taylor acted were somewhat more successful. X Y & Zee, which portrayed Michael Caine and her as a troubled married couple, won her the David di Donatello for Best Foreign Actress. She appeared with Burton in the adaptation of Dylan Thomas's Under Milk Wood; although her role was small, the producers decided to give her top-billing to profit from her fame.[8]: 313–316 Her third film role that year was playing a blonde diner waitress in Peter Ustinov's Faust parody Hammersmith Is Out, her tenth collaboration with Burton. Although the film was not successful overall,[8]: 316 Taylor received some good reviews, with Vincent Canby of The New York Times writing that she had "a certain vulgar, ratty charm",[47] and Roger Ebert of the Chicago Sun-Times saying, "The spectacle of Elizabeth Taylor growing older and more beautiful continues to amaze the population."[48] Her performance won the Silver Bear for Best Actress at the Berlin Film Festival.[46]
In Divorce His, Divorce Hers (1973), Taylor's last film with Burton
Taylor and Burton's last film together was the Harlech Television film Divorce His, Divorce Hers (1973), fittingly named as they divorced the following year.[8]: 357 Her other films released in 1973 were the British thriller Night Watch and the American drama Ash Wednesday.[8]: 341–349, 357–358 For the latter, in which she starred as a woman who undergoes multiple plastic surgeries in an attempt to save her marriage, she received a Golden Globe nomination.[49] Her only film released in 1974, the Italian Muriel Spark adaptation The Driver's Seat, was a failure.[8]: 371–375
Taylor took fewer roles after the mid-1970s, and focused on supporting the career of her sixth husband, Republican politician John Warner, later a US senator. In 1976, she appeared in the Soviet-American fantasy film The Blue Bird, a critical and box-office failure, and had a small role in the television film Victory at Entebbe. In 1977, she sang in the critically panned film adaptation of Stephen Sondheim's musical A Little Night Music.[8]: 388–389, 403
After a period of semi-retirement from films, Taylor starred in The Mirror Crack'd (1980), adapted from an Agatha Christie mystery novel and featuring an ensemble cast of actors from the studio era, such as Angela Lansbury, Kim Novak, Rock Hudson, and Tony Curtis.[8]: 435 Wanting to challenge herself, she took on her first substantial stage role, playing Regina Giddens in a Broadway production of Lillian Hellman's The Little Foxes.[8]: 411 [1]: 347–362 Instead of portraying Giddens in a negative light, as had often been the case in previous productions, Taylor's idea was to show her as a victim of circumstance, explaining, "She's a killer, but she's saying, 'Sorry fellas, you put me in this position'."[1]: 349
The production premiered in May 1981, and had a sold-out six-month run despite mixed reviews.[8]: 411 [1]: 347–362 Frank Rich of The New York Times wrote that Taylor's performance as "Regina Giddens, that malignant Southern bitch-goddess ... begins gingerly, soon gathers steam, and then explodes into a black and thunderous storm that may just knock you out of your seat",[50] while Dan Sullivan of the Los Angeles Times stated, "Taylor presents a possible Regina Giddens, as seen through the persona of Elizabeth Taylor. There's some acting in it, as well as some personal display."[51] She appeared as evil socialite Helena Cassadine in the daytime soap opera General Hospital in November 1981.[1]: 347–362 The following year, she continued performing The Little Foxes in London's West End, but received largely negative reviews from the British press.[1]: 347–362
Encouraged by the success of The Little Foxes, Taylor and producer Zev Buffman founded the Elizabeth Taylor Repertory Company.[1]: 347–362 Its first and only production was a revival of Noël Coward's comedy Private Lives, starring Taylor and Burton.[8]: 413–425 [1]: 347–362 [52] It premiered in Boston in early 1983, and although commercially successful, received generally negative reviews, with critics noting that both stars were in noticeably poor health – Taylor admitted herself to a drug and alcohol rehabilitation center after the play's run ended, and Burton died the following year.[8]: 413–425 [1]: 347–362 After the failure of Private Lives, Taylor dissolved her theatre company.[53] Her only other project that year was the television film Between Friends.[54]
From the mid-1980s, Taylor acted mostly in television productions. She made cameos in the soap operas Hotel and All My Children in 1984, and played a brothel keeper in the historical miniseries North and South in 1985.[8]: 363–373 She also starred in several television films, playing gossip columnist Louella Parsons in Malice in Wonderland (1985), a fading movie star in the drama There Must Be a Pony (1986),[55] and a character based on Poker Alice in the eponymous Western (1987).[1]: 363–373 She reunited with director Franco Zeffirelli to appear in his French-Italian biopic Young Toscanini (1988), and had the last starring role of her career in a television adaptation of Sweet Bird of Youth (1989), her fourth Tennessee Williams play.[1]: 363–373 During this time, she also began receiving honorary awards for her career – the Cecil B. DeMille Award in 1985,[49] and the Film Society of Lincoln Center's Chaplin Award in 1986.[56]
Taylor was one of the first celebrities to participate in HIV/AIDS activism, and from the mid-1980s she helped to raise more than $270 million for the cause.[65] She began her philanthropic work after becoming frustrated with the fact that very little was being done to combat the disease despite the media attention.[66] She later explained to Vanity Fair that she "decided that with my name, I could open certain doors, that I was a commodity in myself – and I'm not talking as an actress. I could take the fame I'd resented and tried to get away from for so many years – but you can never get away from it – and use it to do some good. I wanted to retire, but the tabloids wouldn't let me. So, I thought: If you're going to screw me over, I'll use you."[67]
Congresswoman Nancy Pelosi (left) with Taylor (right) as she testified before the House Budget Committee on HIV/AIDS funding in 1990
Taylor began her philanthropic efforts in 1984, helping to organize and host the first AIDS fundraiser to benefit the AIDS Project Los Angeles.[67][68] In August 1985, she and Michael Gottlieb founded the National AIDS Research Foundation after her friend and former co-star Rock Hudson announced that he was dying of the disease.[67][68] The following month, the foundation merged with Mathilde Krim's AIDS foundation to form the American Foundation for AIDS Research (amfAR).[69][70] As amfAR's focus is on research funding, Taylor founded the Elizabeth Taylor AIDS Foundation (ETAF) in 1991 to raise awareness and to provide support services for people with HIV/AIDS, paying for its overhead costs herself.[67][68][71] Since her death, her estate has continued to fund ETAF's work, and donates 25% of royalties from the use of her image and likeness to the foundation.[71] In addition to her work for people affected by HIV/AIDS in the United States, Taylor was instrumental in expanding amfAR's operations to other countries; ETAF also operates internationally.[67]
Taylor testified before the Senate and House in support of the Ryan White CARE Act in 1986, 1990, and 1992.[70][72] She persuaded President Ronald Reagan to acknowledge the disease for the first time in a speech in 1987, and publicly criticized presidents George H. W. Bush and Bill Clinton for lack of interest in combating the disease.[67][68] Taylor also founded the Elizabeth Taylor Medical Center to offer free HIV/AIDS testing and care at the Whitman-Walker Clinic in Washington, DC, and the Elizabeth Taylor Endowment Fund for the UCLA Clinical AIDS Research and Education Center in Los Angeles.[70] In 2015, Taylor's business partner Kathy Ireland claimed that Taylor ran an illegal "underground network" that distributed medications to Americans suffering from HIV/AIDS during the 1980s, when the Food and Drug Administration had not yet approved them.[73] The claim was challenged by several people, including amfAR's former vice-president for development and external affairs, Taylor's former publicist, and activists who were involved in Project Inform in the 1980s and 1990s.[74]
Taylor promoting her first fragrance, Passion, in 1987
Taylor created a collection of fragrances whose unprecedented success helped establish the trend of celebrity-branded perfumes in later years.[75][76][77] In collaboration with Elizabeth Arden, Inc., she began by launching two best-selling perfumes – Passion in 1987, and White Diamonds in 1991.[76] Taylor personally supervised the creation and production of each of the 11 fragrances marketed in her name.[76] According to biographers Sam Kashner and Nancy Schoenberger, she earned more money through the fragrance collection than during her entire acting career,[8]: 436 and upon her death, the British newspaper The Guardian estimated that the majority of her estimated $600 million to $1 billion estate consisted of revenue from fragrances.[76] In 2005, Taylor also founded a jewelry company, House of Taylor, in collaboration with Kathy Ireland and Jack and Monty Abramov.[78]
Throughout her adult years, Taylor's personal life, especially her eight marriages (two to the same man), drew a large amount of media attention and public disapproval. According to biographer Alexander Walker, "Whether she liked it or not ... marriage is the matrix of the myth that began surrounding Elizabeth Taylor from [when she was sixteen]."[1]: 126 In 1948, MGM arranged for her to date American football champion Glenn Davis and she announced plans for them to marry once he returned from Korea.[79] The following year, Taylor was briefly engaged to William Pawley Jr., son of US ambassador William D. Pawley.[80][1]: 75–88 Film tycoon Howard Hughes also wanted to marry her, and offered to pay her parents a six-figure sum of money if she were to become his wife.[1]: 81–82 Taylor declined the offer, but was otherwise eager to marry young, as her "rather puritanical upbringing and beliefs" made her believe that "love was synonymous with marriage."[13] Taylor later described herself as being "emotionally immature" during this time due to her sheltered childhood, and believed that she could gain independence from her parents and MGM through marriage.[13]
Taylor was 18 years old when she married Conrad "Nicky" Hilton Jr., heir to the Hilton Hotels chain, at the Church of the Good Shepherd in Beverly Hills on May 6, 1950.[81][1]: 106–112 MGM organized the large and expensive wedding, which became a major media event.[1]: 106–112 In the weeks after their wedding, Taylor realized that she had made a mistake; not only did she and Hilton have few interests in common, but he was also abusive and a heavy drinker.[1]: 113–119 Taylor suffered a miscarriage during one of his violent outbursts.[82][83][84] She announced their separation on December 14, 1950,[85] and was granted a divorce on the grounds of mental cruelty on January 29, 1951, eight months after their wedding.[86][1]: 120–125
Taylor married her second husband, British actor Michael Wilding – a man 20 years her senior – in a low-key ceremony at Caxton Hall in London on February 21, 1952.[1]: 139 She had first met him in 1948 while filming Conspirator in England, and their relationship began when she returned to film Ivanhoe in 1951.[1]: 131–133 Taylor found their age gap appealing. She wanted "the calm and quiet and security of friendship" from their relationship;[13] he hoped that the marriage would aid his career in Hollywood.[1]: 136 They had two sons: Michael Howard (born January 6, 1953) and Christopher Edward (born February 27, 1955; Taylor's 23rd birthday).[1]: 148, 160 As Taylor grew older and more confident in herself, she began to drift apart from Wilding, whose failing career was also a source of marital strife.[1]: 160–165 When she was away filming Giant in 1955, gossip magazine Confidential caused a scandal by claiming that he had entertained strippers at their home.[1]: 164–165 Taylor and Wilding announced their separation on July 18, 1956, and were divorced on January 26, 1957.[87]
Taylor with her third husband Mike Todd and her three children in 1957
Taylor was three months pregnant when she married her third husband, theatre and film producer Mike Todd, in Acapulco, Guerrero, Mexico, on February 2, 1957.[1]: 178–180 They had one daughter, Elizabeth "Liza" Frances (born August 6, 1957).[1]: 186 Todd, known for publicity stunts, encouraged media attention on their marriage; for example, in June 1957, he threw a birthday party at Madison Square Garden, which was attended by 18,000 guests and broadcast on CBS.[8]: 5–6 [1]: 188 His death in a plane crash on March 22, 1958, left Taylor devastated.[8]: 5–6 [1]: 193–202 She was comforted by a friend of Todd's and hers, singer Eddie Fisher, with whom she soon began an affair.[8]: 7–9 [1]: 201–210 Fisher was still married to actress Debbie Reynolds. The affair resulted in a public scandal, with Taylor being branded a "homewrecker."[8]: 7–9 [1]: 201–210 Taylor and Fisher were married at the Temple Beth Sholom in Las Vegas on May 12, 1959; she later stated that she married him only due to her grief.[8]: 7–9 [1]: 201–210 [13] Taylor and Reynolds would reconcile in the 1960s.[88]
While filming Cleopatra in Italy in 1962, Taylor began an affair with her co-star, Welsh actor Richard Burton, although Burton was also married. Rumors about the affair began to circulate in the press, and were confirmed by a paparazzi shot of them on a yacht in Ischia.[8]: 27–34 According to sociologist Ellis Cashmore, the publication of the photograph was a "turning point", beginning a new era in which it became difficult for celebrities to keep their personal lives separate from their public images.[89] The scandal caused Taylor and Burton to be condemned for "erotic vagrancy" by the Vatican, and there were calls in the US Congress to bar them from re-entering the country.[8]: 36 Taylor was granted a divorce from Fisher on March 5, 1964, in Puerto Vallarta, Jalisco, Mexico, and married Burton 10 days later in a private ceremony at the Ritz-Carlton Montreal.[8]: 99–100 Burton subsequently adopted Liza Todd and Maria McKeown (born 1961), a German orphan whose adoption process Taylor had begun while married to Fisher.[90][91]
Dubbed "Liz and Dick" by the media, Taylor and Burton starred together in 11 films, and led a jet-set lifestyle, spending millions on "furs, diamonds, paintings, designer clothes, travel, food, liquor, a yacht, and a jet."[8]: 193 Sociologist Karen Sternheimer states that they "became a cottage industry of speculation about their alleged life of excess. From reports of massive spending [...] affairs, and even an open marriage, the couple came to represent a new era of 'gotcha' celebrity coverage, where the more personal the story, the better."[92] They divorced for the first time in June 1974, but reconciled, and remarried in Kasane, Botswana, on 10 October 1975.[8]: 376, 391–394 The second marriage lasted less than a year, ending in divorce in July 1976.[8]: 384–385, 406 Taylor and Burton's relationship was often referred to as the "marriage of the century" by the media, and she later stated, "After Richard, the men in my life were just there to hold the coat, to open the door. All the men after Richard were really just company."[8]: vii, 437 Soon after her final divorce from Burton, Taylor met her sixth husband, John Warner, a Republican politician from Virginia.[8]: 402–405 They were married on 4 December 1976, after which Taylor concentrated on working for his electoral campaign.[8]: 402–405 Once Warner had been elected to the Senate, she started to find her life as a politician's wife in Washington, D.C. boring and lonely, becoming depressed, overweight, and increasingly addicted to prescription drugs and alcohol.[8]: 402–405 Taylor and Warner separated in December 1981, and divorced on 5 November 1982.[8]: 410–411
After the divorce from Warner, Taylor dated actors Anthony Geary[93] and George Hamilton,[94] and was engaged to Mexican lawyer Victor Luna in 1983–1984,[8]: 422–434 and New York businessman Dennis Stein in 1985.[95] She met her seventh and last husband, construction worker Larry Fortensky, at the Betty Ford Center in 1988.[8]: 437 [1]: 465–466 They were married at the Neverland Ranch of her close friend Michael Jackson on October 6, 1991.[96] The wedding was again subject to intense media attention, with one photographer parachuting into the ranch and Taylor selling the wedding pictures to People for $1 million (equivalent to $2.31 million in 2024), which she used to start her AIDS foundation.[97][70] Taylor and Fortensky divorced on October 31, 1996,[8]: 437 but remained in contact for life.[98] She attributed the split to her painful hip operations and his obsessive-compulsive disorder.[99][100] In the winter of 1999, Fortensky underwent brain surgery after falling off a balcony and was comatose for six weeks; Taylor immediately notified the hospital that she would personally guarantee his medical expenses.[101] At the end of 2010, she wrote him a letter that read: "You're a part of my life that cannot be carved out nor do I ever wish it to be."[102] Taylor's last phone call with Fortensky was on February 7, 2011, one day before she checked into the hospital for what turned out to be her final stay. He told her she would outlive him.[103] Although they had been divorced for almost 15 years, Taylor left Fortensky $825,000 in her will.[104]
In the last years of her life, Taylor had a platonic friendship with the actor Colin Farrell; they often spoke on the phone about insomnia and how to cope with it.[105]
Taylor was raised as a Christian Scientist, and converted to Judaism in 1959.[8]: 173–174 [1]: 206–210 Although two of her husbands – Mike Todd and Eddie Fisher – were Jewish, Taylor stated that she did not convert because of them, and had wanted to do so "for a long time",[106] and that there was "comfort and dignity and hope for me in this ancient religion that [has] survived for four thousand years... I feel as if I have been a Jew all my life."[107] Walker believed that Taylor was influenced in her decision by her godfather, Victor Cazalet, and her mother, who were active supporters of Zionism during her childhood.[1]: 14
Following her conversion, Taylor became an active supporter of Jewish and Zionist causes.[108][109] In 1959, she purchased $100,000 worth of Israeli bonds, which led to her films being banned by Arab countries throughout the Middle East and Africa.[110][109] She was also barred from entering Egypt to film Cleopatra in 1962, but the ban was lifted two years later after the Egyptian officials deemed that the film brought positive publicity for the country.[108] In addition to purchasing bonds, Taylor helped to raise money for organizations such as the Jewish National Fund,[108] and sat on the board of trustees of the Simon Wiesenthal Center.[111]
Taylor is considered a fashion icon both for her film costumes and personal style.[112][113][114] At MGM, her costumes were mostly designed by Helen Rose and Edith Head,[115] and in the 1960s by Irene Sharaff.[113][116] Her most famous costumes include a white ball gown in A Place in the Sun (1951), a Grecian dress in Cat on a Hot Tin Roof (1958), a green A-line dress in Suddenly Last Summer (1959), and a slip and a fur coat in BUtterfield 8 (1960).[112][113][114] Her look in Cleopatra (1963) started a trend for "cat-eye" makeup done with black eyeliner.[8]: 135–136
Taylor collected jewelry throughout her life, and owned the 33.19-carat (6.638 g) Krupp Diamond, the 69.42-carat (13.884 g) Taylor-Burton Diamond, and the 50-carat (10 g) La Peregrina Pearl, all three of which were gifts from husband Richard Burton.[8]: 237–238, 258–259, 275–276 She also published a book about her collection, My Love Affair with Jewelry, in 2002.[113][117] Taylor helped to popularise the work of fashion designers Valentino Garavani[115][118] and Halston.[113][119] She received a Lifetime of Glamour Award from the Council of Fashion Designers of America (CFDA) in 1997.[120] After her death, her jewelry and fashion collections were auctioned by Christie's to benefit her AIDS foundation, ETAF. The jewelry sold for a record-breaking sum of $156.8 million,[121] and the clothes and accessories for a further $5.5 million.[122]
Taylor struggled with health problems for most of her life.[65] She was born with scoliosis[123] and broke her back while filming National Velvet in 1944.[1]: 40–47 The fracture went undetected for several years, although it caused her chronic back problems.[1]: 40–47 In 1956, she underwent an operation in which some of her spinal discs were removed and replaced with donated bone.[1]: 175 Taylor was also prone to other illnesses and injuries, which often necessitated surgery; in 1961, she survived a near-fatal bout of pneumonia that required a tracheotomy.[8] She was treated for the pneumonia with bacteriophage therapy.[124]
In 1968 she underwent an emergency hysterectomy, which exacerbated her back problems and contributed to hip problems. She became addicted to alcohol and to prescription painkillers and tranquilizers, perhaps as a form of self-medication. She was treated at the Betty Ford Center for seven weeks from December 1983 to January 1984, becoming the first celebrity to openly admit herself to the clinic.[8]: 424–425 She relapsed later in the decade and entered rehabilitation again in 1988.[1]: 366–368 Taylor also struggled with her weight – she became overweight in the 1970s, especially after her marriage to Senator John Warner, and published a diet book about her experiences, Elizabeth Takes Off (1988).[125][126] Taylor was a heavy smoker until she experienced a severe bout of pneumonia in 1990.[127]
Taylor's health increasingly declined during the last two decades of her life, and she rarely attended public events after 1996.[123] Taylor had serious bouts of pneumonia in 1990 and 2000,[68] two hip replacement surgeries in the mid-1990s,[65] surgery for a benign brain tumor in 1997,[65] and successful treatment for skin cancer in 2002.[123] She used a wheelchair due to her back problems and was diagnosed with congestive heart failure in 2004.[128][129] She died of the illness aged 79 on March 23, 2011, at Cedars-Sinai Medical Center in Los Angeles, six weeks after being hospitalized.[130] Her funeral took place the following day at the Forest Lawn Memorial Park in Glendale, California. The service was a private Jewish ceremony presided over by Rabbi Jerome Cutler. At Taylor's request, the ceremony began 15 minutes behind schedule, as, according to her representative, "She even wanted to be late for her own funeral."[131] She was entombed in the cemetery's Great Mausoleum.[132]
Taylor lived at 700 Nimes Road in the Bel Air district of Los Angeles from 1982 until her death in 2011. The art photographer Catherine Opie created an eponymous photographic study of the house in 2011.[133]
More than anyone else I can think of, Elizabeth Taylor represents the complete movie phenomenon – what movies are as an art and an industry, and what they have meant to those of us who have grown up watching them in the dark... Like movies themselves, she's grown up with us, as we have with her. She's someone whose entire life has been played in a series of settings forever denied the fourth wall. Elizabeth Taylor is the most important character she's ever played.[134]
—Vincent Canby of The New York Times in 1986
Taylor was one of the last stars of classical Hollywood cinema[135][136] and one of the first modern celebrities.[137] During the era of the studio system, she exemplified the classic film star. She was portrayed as different from "ordinary" people, and her public image was carefully crafted and controlled by MGM.[138] When the era of classical Hollywood ended in the 1960s, and paparazzi photography became a normal feature of media culture, Taylor came to define a new type of celebrity whose real private life was the focus of public interest.[139][140][141] "More than for any film role," Adam Bernstein of The Washington Post wrote, "she became famous for being famous, setting a media template for later generations of entertainers, models, and all variety of semi-somebodies."[142]
Despite the acting awards she won during her career, Taylor's film performances were often overlooked by contemporary critics;[10][143] according to film historian Jeanine Basinger, "No actress ever had a more difficult job in getting critics to accept her onscreen as someone other than Elizabeth Taylor... Her persona ate her alive."[142] Her film roles often mirrored her personal life, and many critics continue to regard her as always playing herself, rather than acting.[140][142][144] In contrast, Mel Gussow of The New York Times stated that "the range of [Taylor's] acting was surprisingly wide", despite the fact that she never received any professional training.[10] Film critic Peter Bradshaw called her "an actress of such sexiness it was an incitement to riot – sultry and queenly at the same time", and "a shrewd, intelligent, intuitive acting presence in her later years."[145] David Thomson stated that "she had the range, nerve, and instinct that only Bette Davis had had before – and like Davis, Taylor was monster and empress, sweetheart and scold, idiot and wise woman."[146] Five films in which she starred – Lassie Come Home, National Velvet, A Place in the Sun, Giant, and Who's Afraid of Virginia Woolf? – have been preserved in the National Film Registry, and the American Film Institute has named her the seventh greatest female screen legend.
Bust of Taylor in Puerto Vallarta, Mexico
Taylor has also been discussed by journalists and scholars interested in the role of women in Western society. Camille Paglia writes that Taylor was a "pre-feminist woman" who "wields the sexual power that feminism cannot explain and has tried to destroy. Through stars like Taylor, we sense the world-disordering impact of legendary women like Delilah, Salome, and Helen of Troy."[147] In contrast, cultural critic M.G. Lord calls Taylor an "accidental feminist", stating that while she did not identify as a feminist, many of her films had feminist themes and "introduced a broad audience to feminist ideas."[148][b] Similarly, Ben W. Heineman Jr. and Cristine Russell write in The Atlantic that her role in Giant "dismantled stereotypes about women and minorities."[149]
Taylor is considered a gay icon, and received widespread recognition for her HIV/AIDS activism.[142][150][151][152] After her death, GLAAD issued a statement saying that she "was an icon not only in Hollywood, but in the LGBT community, where she worked to ensure that everyone was treated with the respect and dignity we all deserve",[150] and Sir Nick Partridge of the Terrence Higgins Trust called her "the first major star to publicly fight fear and prejudice towards AIDS."[153] According to Paul Flynn of The Guardian, she was "a new type of gay icon, one whose position is based not on tragedy, but on her work for the LGBTQ community."[154] Speaking of her charity work, former President Bill Clinton said at her death, "Elizabeth's legacy will live on in many people around the world whose lives will be longer and better because of her work and the ongoing efforts of those she inspired."[155]
Since Taylor's death, House of Taylor,[156] her estate, has preserved her legacy through content, partnerships, and products. The estate is managed by three trustees whom Taylor selected before her death. They continue to be involved with The Elizabeth Taylor AIDS Foundation[157] and oversee The Elizabeth Taylor Archive.
In 2022, House of Taylor released Elizabeth The First,[158] a 10-part podcast series with Imperative Entertainment and Kitty Purry Productions, narrated by Katy Perry. In December 2022, Elizabeth Taylor: The Grit & Glamour of an Icon by Kate Andersen Brower,[159] the first Elizabeth Taylor biography authorized by the estate, was published.
In 2019, it was announced that Rachel Weisz would portray Taylor in A Special Relationship, an upcoming film about Taylor's journey from actress to activist written by Simon Beaufoy.[160]
In 2024, it was announced that Kim Kardashian would executive produce and feature in a docuseries about Taylor. Commissioned by the BBC, it has been given the working title Elizabeth Taylor: Rebel Superstar.[161]
^ In October 1965, as her then-husband Richard Burton was British, she signed an oath of renunciation at the US Embassy in Paris, but with the phrase "abjure all allegiance and fidelity to the United States" struck out. US State Department officials declared that her renunciation was invalid due to the alteration, and Taylor signed another oath, this time without alteration, in October 1966.[2] She applied for restoration of US citizenship in 1977, during then-husband John Warner's Senate campaign, stating she planned to remain in America for the rest of her life.[3][4]
^ For example, National Velvet (1944) was about a girl attempting to compete in the Grand National despite gender discrimination; A Place in the Sun (1951) is "a cautionary tale from a time before women had ready access to birth control"; her character in BUtterfield 8 (1960) is shown in control of her sexuality; Who's Afraid of Virginia Woolf? (1966) "depicts the anguish that befalls a woman when the only way she can express herself is through her husband's stalled career and children".[148]
^ Weiler, A.H. (August 29, 1951). "A Place in the Sun". The New York Times. Archived from the original on November 24, 2015. Retrieved December 1, 2015.
^ Twilley, Nicola (December 14, 2020). "When a Virus Is the Cure". The New Yorker. Retrieved December 15, 2020. Still, as late as 1961, phage therapy had some American adherents, including Elizabeth Taylor, who received a dose of staph bacteriophage when she developed near-fatal pneumonia during the filming of Cleopatra and needed an emergency tracheotomy.
Doty, Alexander (2012). "Elizabeth Taylor: The Biggest Star in the World". In Wojcik, Pamela Robertson (ed.). New Constellations: Movie Stars of the 1960s. Rutgers University Press. ISBN 978-0-8135-5171-5.
Daniel, Douglass K. (2011). Tough as Nails: The Life and Films of Richard Brooks. University of Wisconsin Press. ISBN 978-0-299-25123-9.
Dye, David (1988). Child and Youth Actors: Filmography of Their Entire Careers, 1914–1985. Jefferson, NC: McFarland & Co., pp. 226–227.
Gehring, Wes D. (2006) [2003]. Irene Dunne: First Lady of Hollywood. Scarecrow Press. ISBN 978-0-8108-5864-0.
Heymann, David C. (1995). Liz: An Intimate Biography of Elizabeth Taylor. Birch Lane Press. ISBN 1-55972-267-3.
Kashner, Sam; Schoenberger, Nancy (2010). Furious Love: Elizabeth Taylor, Richard Burton, and the Marriage of the Century. JR Books. ISBN 978-1-907532-22-1.
Kennedy's father amassed a private fortune and established trust funds for his nine children that guaranteed lifelong financial independence.[7] His business kept him away from home for long stretches, but Joe Sr. was a formidable presence in his children's lives. He encouraged them to be ambitious, emphasized political discussions at the dinner table, and demanded a high level of academic achievement. John's first exposure to politics was touring the Boston wards with his grandfather Fitzgerald during Fitzgerald's failed 1922 gubernatorial campaign.[8][9] With Joe Sr.'s business ventures concentrated on Wall Street and Hollywood and an outbreak of polio in Massachusetts, the family decided to move from Boston to the Riverdale neighborhood of New York City in September 1927.[10][11] Several years later, his brother Robert told Look magazine that his father left Boston because of job signs that read: "No Irish Need Apply."[12] The Kennedys spent summers and early autumns at their home in Hyannis Port, Massachusetts, a village on Cape Cod,[13] where they swam, sailed, and played touch football.[14] Christmas and Easter holidays were spent at their winter retreat in Palm Beach, Florida.[15] In September 1930, Kennedy, 13 years old, was sent to the Canterbury School in New Milford, Connecticut, for 8th grade. In April 1931, he had an appendectomy, after which he withdrew from Canterbury and recuperated at home.[16]
In September 1931, Kennedy started attending Choate, a preparatory boarding school in Wallingford, Connecticut.[17] Rose had wanted John and Joe Jr. to attend a Catholic school, but Joe Sr. thought that if they were to compete in the political world, they needed to be with boys from prominent Protestant families.[18] John spent his first years at Choate in his older brother's shadow and compensated with rebellious behavior that attracted a clique. Their most notorious stunt was exploding a toilet seat with a firecracker. In the next chapel assembly, the headmaster, George St. John, brandished the toilet seat and spoke of "muckers" who would "spit in our sea," leading Kennedy to name his group "The Muckers Club," which included roommate and lifelong friend Lem Billings.[19][20] Kennedy graduated from Choate in June 1935, finishing 64th of 112 students.[11] He had been the business manager of the school yearbook and was voted the "most likely to succeed."[19]
Kennedy intended to study under Harold Laski at the London School of Economics, as his older brother had done, but ill health forced his return to the U.S. in October 1935. He then enrolled late at Princeton University, but had to leave after two months due to gastrointestinal illness.[21]
In September 1936, Kennedy enrolled at Harvard College.[22] He wrote occasionally for The Harvard Crimson, the campus newspaper, but had little involvement with campus politics, preferring to concentrate on athletics and his social life. Kennedy played football and was on the JV squad during his sophomore year, but an injury forced him off the team, and left him with back problems that plagued him for the rest of his life. He won membership in the Hasty Pudding Club and the Spee Club, one of Harvard's elite "final clubs".[23][24]
In July 1938, Kennedy sailed overseas with his older brother to work at the American embassy in London, where his father was serving as President Franklin D. Roosevelt's ambassador to the Court of St. James's.[25] The following year, Kennedy traveled throughout Europe, the Soviet Union, the Balkans, and the Middle East in preparation for his Harvard senior honors thesis.[26] He then went to Berlin, where a U.S. diplomatic representative gave him a secret message about war breaking out soon to pass on to his father, and to Czechoslovakia before returning to London on September 1, 1939, the day that Germany invaded Poland and World War II began.[27] Two days later, the family was in the House of Commons for speeches endorsing the United Kingdom's declaration of war on Germany. Kennedy was sent as his father's representative to help with arrangements for American survivors of the torpedoing of SS Athenia before flying back to the U.S. on his first transatlantic flight.[28][29]
While Kennedy was an upperclassman at Harvard, he began to take his studies more seriously and developed an interest in political philosophy. He made the dean's list in his junior year.[30] In 1940, Kennedy completed his thesis, "Appeasement in Munich", about British negotiations during the Munich Agreement. The thesis was published on July 24, under the title Why England Slept.[31] The book was one of the first to offer information about the war and its origins, and quickly became a bestseller.[32] In addition to addressing Britain's unwillingness to strengthen its military in the lead-up to the war, the book called for an Anglo-American alliance against the rising totalitarian powers. Kennedy became increasingly supportive of U.S. intervention in World War II, and his father's isolationist beliefs resulted in the latter's dismissal as ambassador.[33]
In 1940, Kennedy graduated cum laude from Harvard with a Bachelor of Arts in government, concentrating on international affairs.[34] That fall, he enrolled at the Stanford Graduate School of Business and audited classes,[35] but he left after a semester to help his father complete his memoirs as an American ambassador. In early 1941, Kennedy toured South America.[36][37]
Kennedy planned to attend Yale Law School, but canceled when American entry into World War II seemed imminent.[38] In 1940, Kennedy attempted to enter the army's Officer Candidate School. Despite months of training, he was medically disqualified due to his chronic back problems. On September 24, 1941, Kennedy joined the United States Naval Reserve with the help of Alan Kirk, director of the Office of Naval Intelligence (ONI) and the former naval attaché under Joe Sr. He was commissioned an ensign on October 26, 1941,[39] and joined the ONI staff in Washington, D.C.[40][41][42]
In April 1943, Kennedy was assigned to Motor Torpedo Squadron TWO,[40] and on April 24 he took command of PT-109,[45] then based on Tulagi Island in the Solomons.[41] On the night of August 1–2, in support of the New Georgia campaign, PT-109 and fourteen other PTs were ordered to block or repel four Japanese destroyers and floatplanes carrying food, supplies, and 900 Japanese soldiers to the Vila Plantation garrison on the southern tip of Kolombangara Island in the Solomons. Intelligence had been sent to Kennedy's commander, Thomas G. Warfield, anticipating the arrival of the large Japanese naval force that would pass on the evening of August 1. Of the 24 torpedoes fired that night by eight of the American PTs, not one hit the Japanese convoy.[46] On that moonless night, Kennedy spotted a Japanese destroyer heading north on its return from the base at Kolombangara around 2:00 a.m. and attempted to turn to attack, when PT-109 was suddenly rammed at an angle and cut in half by the destroyer Amagiri, killing two PT-109 crew members.[47][48][41][b] Avoiding surrender, the remaining crew swam towards Plum Pudding Island, 3.5 miles (5.6 km) southwest of the remains of PT-109, on August 2.[41][50] Despite re-injuring his back in the collision, Kennedy towed a badly burned crewman to the island with a life jacket strap clenched between his teeth.[51] From there, Kennedy and his subordinate, Ensign George Ross, made forays through the coral islands, searching for help.[52] When they encountered an English-speaking native with a canoe, Kennedy carved his location on a coconut shell and requested a boat rescue. Seven days after the collision, with the coconut message delivered, the PT-109 crew were rescued.[53][54]
Almost immediately, the PT-109 rescue became a highly publicized event. The story was chronicled by John Hersey in The New Yorker in 1944 (decades later it was the basis of a successful film).[54] It followed Kennedy into politics and provided a strong foundation for his appeal as a leader.[55] Hersey portrayed Kennedy as a modest, self-deprecating hero.[56] For his courage and leadership, Kennedy was awarded the Navy and Marine Corps Medal, and the injuries he suffered during the incident qualified him for a Purple Heart.[55]
After a month's recovery, Kennedy returned to duty, commanding the PT-59. On November 2, Kennedy's PT-59 took part, with two other PTs, in the rescue of 40–50 marines at the base of the Warrior River on Choiseul Island; the 59 acted as a shield from shore fire as the marines escaped on two rescue landing craft, taking ten of them aboard and delivering them to safety.[57] Under doctor's orders, Kennedy was relieved of his command on November 18 and sent to the hospital on Tulagi.[58] By December 1943, with his health deteriorating, Kennedy left the Pacific front and arrived in San Francisco in early January 1944.[59] After receiving treatment for his back injury at the Chelsea Naval Hospital in Massachusetts from May to December 1944, he was released from active duty.[60][40] Beginning in January 1945, Kennedy spent three months recovering from his back injury at Castle Hot Springs, a resort and temporary military hospital in Arizona.[61][62] On March 1, 1945, Kennedy retired from the Navy Reserve on physical disability and was honorably discharged with the full rank of lieutenant.[63] When later asked how he became a war hero, Kennedy joked: "It was easy. They cut my PT boat in half."[64]
On August 12, 1944, Kennedy's older brother, Joe Jr., a navy pilot, was killed on an air mission. His body was never recovered.[65][66] The news reached the family's home in Hyannis Port, Massachusetts, a day later. Kennedy felt that Joe Jr.'s reckless flight was partly an effort to outdo him.[67][68] To console himself, Kennedy set out to assemble a privately published book of remembrances of his brother, As We Remember Joe.[69]
In April 1945, Kennedy's father, who was a friend of William Randolph Hearst, arranged a position for his son as a special correspondent for Hearst Newspapers; the assignment kept Kennedy's name in the public eye and "expose[d] him to journalism as a possible career".[70] That May he went to Berlin as a correspondent,[71] covering the Potsdam Conference and other events.[72]
Kennedy's elder brother Joe Jr. had been the family's political standard-bearer and had been tapped by their father to seek the presidency. After Joe's death, the assignment fell to John as the second-eldest son.[73] Boston mayor Maurice J. Tobin discussed the possibility of John becoming his running mate in 1946 as a candidate for Massachusetts lieutenant governor, but Joe Sr. preferred a congressional campaign that could send John to Washington, where he could have national visibility.[74]
Kennedy (back row, second from right) and Richard Nixon (far right) participate in a radio broadcast as 1947 freshmen House members.
At the urging of Kennedy's father, U.S. Representative James Michael Curley vacated his seat in the strongly Democratic 11th congressional district of Massachusetts to become mayor of Boston in 1946. Kennedy established legal residency at 122 Bowdoin Street across from the Massachusetts State House.[75] Kennedy won the Democratic primary with 42 percent of the vote, defeating nine other candidates.[76] According to Fredrik Logevall, Joe Sr.
spent hours on the phone with reporters and editors, seeking information, trading confidences, and cajoling them into publishing puff pieces on John, ones that invariably played up his war record in the Pacific. He oversaw a professional advertising campaign that ensured ads went up in just the right places (the campaign had a virtual monopoly on [Boston] subway space, and on window stickers ("Kennedy for Congress") for cars and homes) and was the force behind the mass mailing of Hersey's PT-109 article.[77]
Though Republicans took control of the House in the 1946 elections, Kennedy defeated his Republican opponent in the general election, taking 73 percent of the vote.[78]
As a congressman, Kennedy had a reputation for not taking much interest in the running of his office or his constituents' concerns, with one of the highest absenteeism rates in the House, although much of his absence was explained by illness.[79] George Smathers, one of his few political friends at the time, claimed that he was more interested in being a writer than a politician, and that at the time he suffered from extreme shyness.[79] Kennedy found "most of his fellow congressmen boring, preoccupied as they all seemed to be with their narrow political concerns". The arcane House rules and customs, which slowed legislation, exasperated him.[80]
Kennedy served in the House for six years, joining the influential Education and Labor Committee and the Veterans' Affairs Committee. He concentrated his attention on international affairs, supporting the Truman Doctrine as the appropriate response to the emerging Cold War. He also supported public housing and opposed the Labor Management Relations Act of 1947, which restricted the power of labor unions. Though not as vocally anti-communist as Joseph McCarthy, Kennedy supported the Immigration and Nationality Act of 1952, which required communists to register with the government, and he deplored the "loss of China".[81] During a speech in Salem, Massachusetts on January 30, 1949, Kennedy denounced Truman and the State Department for contributing to the "tragic story of China whose freedom we once fought to preserve. What our young men had saved [in World War II], our diplomats and our President have frittered away."[82][83] Having been a Boy Scout in his childhood, Kennedy was active in the Boston Council from 1946 to 1955 as district vice chairman, member of the executive board, vice-president, and National Council Representative.[84][85]
To appeal to the large Italian-American voting bloc in Massachusetts, Kennedy delivered a speech in November 1947 supporting a $227 million aid package to Italy. He maintained that Italy was in danger from an "onslaught of the communist minority" and that the country was the "initial battleground in the communist drive to capture Western Europe."[86] To combat Soviet efforts to take control in Middle Eastern and Asian countries like Indochina, Kennedy wanted the United States to develop nonmilitary techniques of resistance that would not create suspicions of neoimperialism or add to the country's financial burden. The problem, as he saw it, was not simply to be anti-communist but to stand for something that these emerging nations would find appealing.[87]
Almost every weekend that Congress was in session, Kennedy would fly back to Massachusetts to give speeches to veteran, fraternal, and civic groups, while maintaining an index card file on individuals who might be helpful for a campaign for statewide office.[88] Contemplating whether to run for Massachusetts governor or the U.S. Senate, Kennedy abandoned interest in the former, believing that the governor "sat in an office, handing out sewer contracts".[89]
Campaign slogan for Kennedy's 1952 U.S. Senate campaign in Massachusetts
As early as 1949, Kennedy began preparing to run for the Senate in 1952 against Republican three-term incumbent Henry Cabot Lodge Jr. with the campaign slogan "KENNEDY WILL DO MORE FOR MASSACHUSETTS".[90] Joe Sr. again financed his son's candidacy (persuading the Boston Post to switch its support to Kennedy by promising the publisher a $500,000 loan),[91] while John's younger brother Robert emerged as campaign manager.[92] Kennedy's mother and sisters contributed as highly effective canvassers by hosting a series of "teas" at hotels and parlors across Massachusetts to reach out to women voters.[93][94] In the presidential election, Republican Dwight D. Eisenhower carried Massachusetts by 208,000 votes, but Kennedy narrowly defeated Lodge by 70,000 votes for the Senate seat.[95] The following year, he married Jacqueline Bouvier.[96]
Kennedy underwent several spinal operations over the next two years. Often absent from the Senate, he was at times critically ill and received Catholic last rites. During his convalescence in 1956, he published Profiles in Courage, a book about U.S. senators who risked their careers for their personal beliefs, for which he won the Pulitzer Prize for Biography in 1957.[97] Rumors that this work was ghostwritten by his close adviser and speechwriter, Ted Sorensen, were confirmed in Sorensen's 2008 autobiography.[98] In response to criticism that the book included only men, in 1958 he published an article in the women's magazine McCall's that honored "Three Women of Courage": Jeannette Rankin, Anne Hutchinson, and Prudence Crandall.[99][100]
At the start of his first term, Kennedy focused on fulfilling the promise of his campaign to do "more for Massachusetts" than his predecessor. Although Kennedy's and Lodge's legislative records were similarly liberal, Lodge voted for the Taft–Hartley Act of 1947 and Kennedy voted against it. On NBC's Meet the Press, Kennedy excoriated Lodge for not doing enough to prevent the increasing migration of manufacturing jobs from Massachusetts to the South, and blamed the right-to-work provision for giving the South an unfair advantage over Massachusetts in labor costs.[101] In May 1953, Kennedy introduced "The Economic Problems of New England",[102] a 36-point program[103] to help Massachusetts industries such as fishing, textile manufacturing, watchmaking, and shipbuilding, as well as the Boston seaport.[104] Kennedy's policy agenda included protective tariffs, preventing excessive speculation in raw wool, stronger efforts to research and market American fish products, an increase in the Fish and Wildlife Service budget, modernizing reserve-fleet vessels, tax incentives to prevent further business relocations, and the development of hydroelectric and nuclear power in Massachusetts.[105][106][107] Kennedy's suggestions for stimulating the region's economy appealed to both parties by offering benefits to business and labor, and promising to serve national defense. Congress would eventually enact most of the program.[104] Kennedy, a Massachusetts Audubon Society supporter, wanted to make sure that the shorelines of Cape Cod remained unsullied by industrialization. On September 3, 1959, Kennedy co-sponsored the Cape Cod National Seashore bill with his Republican colleague Senator Leverett Saltonstall.[108][109]
As a senator, Kennedy quickly won a reputation for responsiveness to requests from constituents (e.g., co-sponsoring legislation to provide federal loans to help rebuild communities damaged by the 1953 Worcester tornado), except on certain occasions when the national interest was at stake.[110][111] In 1954, Kennedy voted in favor of the Saint Lawrence Seaway, which would connect the Great Lakes to the Atlantic Ocean, despite opposition from Massachusetts politicians who argued that the project would hurt the Port of Boston economically.[112]
In 1954, when the Senate voted to condemn Joseph McCarthy for breaking Senate rules and abusing an Army general, Kennedy was the only Democrat not to cast a vote against him.[113] Kennedy drafted a speech supporting the censure. However, it was not delivered because Kennedy was hospitalized for back surgery in Boston.[114] Although Kennedy never indicated how he would have voted, the episode damaged his support among members of the liberal community in the 1956 and 1960 elections.[115]
In 1957, Kennedy joined the Senate's Select Committee on Labor Rackets (also known as the McClellan Committee), on which his brother Robert served as chief counsel, to investigate racketeering in labor-management relations.[119] The hearings attracted extensive radio and television coverage, in which the Kennedy brothers engaged in dramatic arguments with controversial labor leaders, including Jimmy Hoffa of the Teamsters Union. The following year, Kennedy introduced a bill to prevent the expenditure of union dues for improper purposes or private gain; to forbid loans from union funds for illicit transactions; and to compel audits of unions, which would ensure against false financial reports.[120] It was the first major labor relations bill to pass either house since the Taft–Hartley Act of 1947; it dealt largely with the control of union abuses exposed by the McClellan Committee but did not incorporate the tough Taft–Hartley amendments requested by President Eisenhower. It survived Senate floor attempts to include Taft–Hartley amendments and passed, but was rejected by the House.[121] "Honest union members and the general public can only regard it as a tragedy that politics has prevented the recommendations of the McClellan committee from being carried out this year," Kennedy announced.[122]
Kennedy cast a procedural vote against President Eisenhower's bill for the Civil Rights Act of 1957, which some considered an appeasement of Southern Democratic opponents of the bill.[125] Kennedy did vote for Title III of the act, which would have given the Attorney General powers to enjoin, but Majority Leader Lyndon B. Johnson agreed to let the provision die as a compromise measure.[126] Kennedy also voted for the "Jury Trial Amendment," which many civil rights advocates criticized as weakening the act.[127] A final compromise bill, which Kennedy supported, was passed in September 1957.[128] As a senator from Massachusetts, which lacked a sizable Black population, Kennedy was not particularly sensitive to the problems of African Americans. Robert Kennedy later reflected, "We weren't thinking of the Negroes of Mississippi or Alabama—what should be done for them. We were thinking of what needed to be done in Massachusetts."[129]
Results of the 1958 U.S. Senate election in Massachusetts by municipality. Kennedy's margin of victory of 874,608 votes was the largest in Massachusetts political history.[130][131]
Most historians and political scientists who have written about Kennedy refer to his U.S. Senate years as an interlude.[132] According to Robert Dallek, Kennedy called being a senator "the most corrupting job in the world." He complained that senators were all too quick to cut deals and please campaign contributors to ensure their political futures. Kennedy, with the luxury of a rich father who could finance his campaigns, could remain independent of any special interest, except for those in his home state of Massachusetts that could align against his reelection.[133] According to Robert Caro, Majority Leader Lyndon Johnson viewed Kennedy as a "playboy", describing his performance in the Senate and the House as "pathetic" and, on another occasion, saying that he was "smart enough, but he doesn't like the grunt work".[134] Author John T. Shaw acknowledges that while his Senate career is not associated with acts of "historic statesmanship" or "novel political thought," Kennedy made modest contributions as a legislator, drafting more than 300 bills to assist Massachusetts and the New England region (some of which became law).[135]
In 1958, Kennedy was re-elected to the Senate, defeating his Republican opponent, Boston lawyer Vincent J. Celeste, with 73.6 percent of the vote, the largest winning margin in the history of Massachusetts politics.[95] In the aftermath of his re-election, Kennedy began preparing to run for president by traveling throughout the U.S. with the aim of building his candidacy for 1960.[136][119]
On January 2, 1960, Kennedy announced his candidacy for the Democratic presidential nomination.[137] Though some questioned Kennedy's age and experience, his charisma and eloquence earned him numerous supporters. Kennedy faced several potential challengers, including Senate Majority Leader Lyndon Johnson, Adlai Stevenson II, and Senator Hubert Humphrey.[138]
Kennedy traveled extensively to build his support. His campaign strategy was to win several primaries to demonstrate his electability to the party bosses, who controlled most of the delegates, and to prove to his detractors that a Catholic could win popular support.[139] Victories over Senator Humphrey in the Wisconsin and West Virginia primaries gave Kennedy momentum as he moved on to the 1960 Democratic National Convention in Los Angeles.[138][140]
When Kennedy entered the convention, he had the most delegates, but not enough to ensure that he would win the nomination.[141] Stevenson—the 1952 and 1956 presidential nominee—remained very popular, while Johnson also hoped to win the nomination with support from party leaders. Kennedy's candidacy also faced opposition from former President Harry S. Truman, who was concerned about Kennedy's lack of experience. Kennedy knew that a second ballot could give the nomination to Johnson or someone else, and his well-organized campaign was able to earn the support of just enough delegates to win the presidential nomination on the first ballot.[142]
In choosing Johnson as his vice-presidential nominee, Kennedy ignored the opposition of liberal supporters and of his brother Robert, who had wanted him to choose labor leader Walter Reuther.[143] Kennedy believed that the Texas senator could help him win support from the South.[144][145] In accepting the presidential nomination, Kennedy gave his well-known "New Frontier" speech:
For the problems are not all solved and the battles are not all won—and we stand today on the edge of a New Frontier. ... But the New Frontier of which I speak is not a set of promises—it is a set of challenges. It sums up not what I intend to offer the American people, but what I intend to ask of them.[146]
At the start of the fall general election campaign, the Republican nominee and incumbent Vice President Richard Nixon held a six-point lead in the polls.[147] Major issues included how to get the economy moving again, Kennedy's Catholicism, the Cuban Revolution, and whether the space and missile programs of the Soviet Union had surpassed those of the U.S. To address fears that his being Catholic would impact his decision-making, he told the Greater Houston Ministerial Association on September 12: "I am not the Catholic candidate for president. I am the Democratic Party candidate for president who also happens to be a Catholic. I do not speak for my Church on public matters—and the Church does not speak for me."[148] He promised to respect the separation of church and state, and not to allow Catholic officials to dictate public policy.[149][150]
Kennedy and Richard Nixon participate in the nation's second televised presidential debate, October 7, 1960.
The Kennedy and Nixon campaigns agreed to a series of televised debates.[151] An estimated 70 million Americans, about two-thirds of the electorate, watched the first debate on September 26.[152] Kennedy had met the day before with the producer to discuss the set design and camera placement. Nixon, just out of the hospital after a painful knee injury, did not take advantage of this opportunity, and during the debate he looked at the reporters asking questions rather than at the camera. Kennedy wore a blue suit and shirt to cut down on glare and appeared sharply focused against the gray studio background. Nixon wore a light-colored suit that blended into the gray background; in combination with the harsh studio lighting that left him perspiring, he offered a less-than-commanding presence. By contrast, Kennedy appeared relaxed, tanned, and telegenic, looking into the camera while answering questions.[153][151] It is often claimed that television viewers, to whom Kennedy appeared the more attractive of the two, overwhelmingly believed he had won, while radio listeners (a smaller audience) thought Nixon had prevailed.[152][154][155] However, only one poll divided respondents into television viewers and radio listeners in this way, and its methodology was poor.[156] Pollster Elmo Roper concluded that the debates raised interest, boosted turnout, and gave Kennedy an extra two million votes, mostly as a result of the first debate.[157] The debates are now considered a milestone in American political history—the point at which the medium of television began to play a dominant role.[97]
1960 presidential election results
Kennedy's campaign gained momentum after the first debate, and he pulled slightly ahead of Nixon in most polls. On Election Day, Kennedy defeated Nixon in one of the closest presidential elections of the 20th century. In the national popular vote, by most accounts, Kennedy led Nixon by just two-tenths of one percent (49.7% to 49.5%), while in the Electoral College, he won 303 votes to Nixon's 219 (269 were needed to win).[158] Fourteen electors from Mississippi and Alabama refused to support Kennedy because of his support for the civil rights movement; they voted for Senator Harry F. Byrd of Virginia, as did an elector from Oklahoma.[158] Forty-three years old, Kennedy was the youngest person ever elected to the presidency (though Theodore Roosevelt was a year younger when he succeeded to the presidency after the assassination of William McKinley in 1901).[159][160]
Kennedy was sworn in as the 35th president at noon on January 20, 1961. In his inaugural address, he spoke of the need for all Americans to be active citizens: "Ask not what your country can do for you—ask what you can do for your country." He asked the nations of the world to join to fight what he called the "common enemies of man: tyranny, poverty, disease, and war itself."[161] He added:
All this will not be finished in the first one hundred days. Nor will it be finished in the first one thousand days, nor in the life of this Administration, nor even perhaps in our lifetime on this planet. But let us begin." In closing, he expanded on his desire for greater internationalism: "Finally, whether you are citizens of America or citizens of the world, ask of us here the same high standards of strength and sacrifice which we ask of you."[161]
The address reflected Kennedy's confidence that his administration would chart a historically significant course in both domestic policy and foreign affairs. The contrast between this optimistic vision and the pressures of managing daily political realities would be one of the main tensions of the early years of his administration.[162]
Kennedy scrapped Eisenhower's decision-making structure,[163] preferring a wheel-like organizational structure with all the spokes leading to the president; he was willing to make the increased number of quick decisions required in such an environment.[164] Though the cabinet remained important, Kennedy generally relied more on his staffers within the Executive Office.[165] In spite of concerns over nepotism, Kennedy's father insisted that Robert Kennedy become U.S. Attorney General, and the younger Kennedy became the "assistant president" who advised on all major issues.[166]
Kennedy's foreign policy was dominated by American confrontations with the Soviet Union, manifested by proxy contests in the global state of tension known as the Cold War. Like his predecessors, Kennedy adopted the policy of containment to stop the spread of communism.[167] Fearful of the possibility of nuclear war, Kennedy implemented a defense strategy known as flexible response. This strategy relied on multiple options for responding to the Soviet Union, discouraged massive retaliation, and encouraged mutual deterrence.[168][169] In contrast to Eisenhower's warning about the perils of the military-industrial complex, Kennedy focused on rearmament. From 1961 to 1964 the number of nuclear weapons increased by 50 percent, as did the number of B-52 bombers to deliver them.[170]
President Kennedy with Congolese Prime Minister Cyrille Adoula in 1962
Between 1960 and 1963, twenty-four countries gained independence as the process of decolonization continued. Kennedy set out to woo the leaders and people of the "Third World," expanding economic aid and appointing knowledgeable ambassadors.[173] His administration established the Food for Peace program and the Peace Corps to provide aid to developing countries. The Food for Peace program became a central element in American foreign policy, and eventually helped many countries to develop their economies and become commercial import customers.[174]
During the election campaign, Kennedy attacked the Eisenhower administration for losing ground on the African continent,[175] and stressed that the U.S. should be on the side of anti-colonialism and self-determination.[176] Kennedy considered the Congo Crisis to be among the most important foreign policy issues facing his presidency. Moïse Tshombe, leader of the province of Katanga, had declared its independence from the Congo, and the Soviet Union responded by sending weapons and technicians to underwrite the central government's struggle against the secession;[176] Kennedy supported a UN operation that prevented Katanga's secession.[177] On October 2, 1962, Kennedy signed a United Nations bond issue bill to ensure U.S. assistance in financing UN peacekeeping operations in the Congo and elsewhere.[178]
Kennedy greets Peace Corps volunteers on August 28, 1961
In one of his first presidential acts, Kennedy signed Executive Order 10924, which officially established the Peace Corps. He named his brother-in-law, Sargent Shriver, as its first director.[179] Through this program, Americans volunteered to help developing countries in fields like education, farming, health care, and construction.[180] Kennedy believed that countries that received Peace Corps volunteers were less likely to succumb to a communist revolution.[181] Tanganyika (present-day Tanzania) and Ghana were the first countries to participate.[182] The organization grew to 5,000 members by March 1963 and 10,000 the year after.[183] Since 1961, over 200,000 Americans have joined the Peace Corps, serving in 139 countries.[184][185]
Kennedy anxiously anticipated a summit with Nikita Khrushchev. Preparations for the summit got off to a problematic start when Kennedy reacted aggressively to a routine Khrushchev speech on Cold War confrontation in early 1961. The speech was intended for domestic audiences in the Soviet Union, but Kennedy interpreted it as a personal challenge, a misreading that helped raise tensions going into the Vienna summit.[186] The summit would cover several topics, but both leaders knew that the most contentious issue would be Berlin, which had been divided in two since the start of the Cold War. The enclave of West Berlin lay within Soviet-allied East Germany but was supported by the U.S. and other Western powers. The Soviets wanted to reunify Berlin under the control of East Germany, partly because of the large number of East Germans who had fled to West Berlin.[187]
On June 4, 1961, Kennedy met with Khrushchev in Vienna and left the meeting angry and disappointed that he had allowed the premier to bully him, despite the warnings he had received. Khrushchev, for his part, was impressed with the president's intelligence but thought him weak. Kennedy did succeed in conveying the bottom line to Khrushchev on the most sensitive issue before them, a proposed treaty between Moscow and East Germany. He made it clear that any treaty interfering with U.S. access rights in West Berlin would be regarded as an act of war.[188] Shortly after Kennedy returned home, the Soviet Union announced its plan to sign a treaty with East Germany, abrogating any third-party occupation rights in either sector of Berlin. Kennedy assumed that his only option was to prepare the country for nuclear war, which he thought had a one-in-five chance of occurring.[189]
In the weeks immediately following the summit, more than 20,000 people fled from East Berlin to the western sector, reacting to statements from the Soviet Union. Kennedy began intensive meetings on the Berlin issue, where Dean Acheson took the lead in recommending a military buildup alongside NATO allies.[190] In a July 1961 speech, Kennedy announced his decision to add $3.25 billion (equivalent to $34.2 billion in 2024) to the defense budget, along with over 200,000 additional troops, stating that an attack on West Berlin would be taken as an attack on the U.S. The speech received an 85% approval rating.[191]
A month later, the Soviet Union and East Germany began blocking any further passage of East Germans into West Berlin and erected barbed-wire fences, which were quickly upgraded to the Berlin Wall. Kennedy acquiesced to the wall, though he sent Vice President Johnson to West Berlin to reaffirm the U.S. commitment to the enclave's defense. In the following months, in a sign of rising Cold War tensions, both the U.S. and the Soviet Union ended a moratorium on nuclear weapon testing.[192] A brief stand-off between U.S. and Soviet tanks occurred at Checkpoint Charlie in October, following a dispute over the free movement of Allied personnel. The crisis was defused largely through a backchannel the Kennedy administration had set up with Soviet spy Georgi Bolshakov.[193] In remarks to his aides on the Berlin Wall, Kennedy noted that "it's not a very nice solution, but a wall is a hell of a lot better than a war."[194]
The Eisenhower administration had created a plan to overthrow Fidel Castro's regime through an invasion of Cuba by a counter-revolutionary insurgency composed of U.S.-trained, anti-Castro Cuban exiles[195][196] led by CIA paramilitary officers.[197] Kennedy had campaigned on a hardline stance against Castro, and when presented with the plan developed under the Eisenhower administration, he enthusiastically adopted it regardless of the risk of inflaming tensions with the Soviet Union.[198] Kennedy approved the final invasion plan on April 4, 1961.[199]
On April 15, 1961, eight CIA-supplied B-26 bombers left Nicaragua to bomb Cuban airfields. The bombers missed many of their targets, leaving most of Castro's air force intact.[200] On April 17, the 1,500 U.S.-trained Cuban exile invasion force, known as Brigade 2506, landed at beaches along the Bay of Pigs and immediately came under heavy fire.[201] The goal was to spark a widespread popular uprising against Castro, but no such uprising occurred.[202] No U.S. air support was provided.[203] The invading force was defeated within two days by the Cuban Revolutionary Armed Forces;[204] 114 were killed and Kennedy was forced to negotiate for the release of the 1,189 survivors.[205] After twenty months, Cuba released the captured exiles in exchange for a ransom of $53 million worth of food and medicine.[206] The incident made Castro wary of the U.S. and led him to believe that another invasion would take place.[207]
Biographer Richard Reeves said that Kennedy focused primarily on the political repercussions of the plan rather than military considerations. When it proved unsuccessful, he was convinced that the plan was a setup to make him look bad.[208] He took responsibility for the failure, saying, "We got a big kick in the leg and we deserved it. But maybe we'll learn something from it."[209] Kennedy's approval ratings climbed afterwards, helped in part by the vocal support given to him by Nixon and Eisenhower.[210] He appointed Robert Kennedy to help lead a committee to examine the causes of the failure.[211] The Kennedy administration banned all Cuban imports and convinced the Organization of American States (OAS) to expel Cuba.[212]
In late 1961, the White House formed the Special Group (Augmented), headed by Robert Kennedy and including Edward Lansdale, Secretary Robert McNamara, and others. The group's objective—to overthrow Castro via espionage, sabotage, and other covert tactics—was never pursued.[213] In November 1961, Kennedy authorized Operation Mongoose.[214] In March 1962, Kennedy rejected Operation Northwoods, a set of proposals to stage false flag attacks against American military and civilian targets and blame them on the Cuban government in order to gain approval for a war against Cuba.[215] However, the administration continued to plan for an invasion of Cuba in the summer of 1962.[214]
In the aftermath of the Bay of Pigs invasion, Khrushchev increased economic and military assistance to Cuba.[216] The Soviet Union planned to deploy in Cuba 49 medium-range ballistic missiles, 32 intermediate-range ballistic missiles, 49 light Il-28 bombers, and about 100 tactical nuclear weapons.[217] The Kennedy administration viewed the growing Cuba–Soviet alliance with alarm, fearing that it could eventually pose a threat to the U.S.[218] On October 14, 1962, CIA U-2 spy planes photographed the Soviets' construction of intermediate-range ballistic missile sites in Cuba. The photos were shown to Kennedy on October 16; a consensus was reached that the missiles were offensive in nature and posed an immediate nuclear threat.[219]
Kennedy faced a dilemma: if the U.S. attacked the sites, it might lead to nuclear war with the Soviet Union, but if the U.S. did nothing, it would be faced with the increased threat from close-range nuclear weapons (positioned approximately 90 mi (140 km) away from the Florida coast).[220] The U.S. would also appear to the world as less committed to the defense of the Western Hemisphere. On a personal level, Kennedy needed to show resolve in reaction to Khrushchev, especially after the Vienna summit.[221] To deal with the crisis, he formed an ad-hoc body of key advisers, later known as EXCOMM, that met secretly between October 16 and 28.[222]
More than a third of U.S. National Security Council (NSC) members favored an unannounced air assault on the missile sites, but some saw this as "Pearl Harbor in reverse."[223] Among members of the international community consulted in confidence, there was some concern that the assault plan was an overreaction, given that Eisenhower had placed PGM-19 Jupiter missiles in Italy and Turkey in 1958; nor could it be assured that the assault would be 100% effective.[224] In concurrence with a majority vote of the NSC, Kennedy decided on a naval blockade (or "quarantine"). On October 22, after privately informing the cabinet and leading members of Congress about the situation, Kennedy announced the naval blockade on national television and warned that U.S. forces would seize "offensive weapons and associated materiel" that Soviet vessels might attempt to deliver to Cuba.[225]
Kennedy confers with Attorney General Robert Kennedy, c. October 1962.
The U.S. Navy would stop and inspect all Soviet ships arriving off Cuba, beginning October 24. Several Soviet ships approached the blockade line, but they stopped or reversed course.[226] The OAS gave unanimous support to the removal of the missiles. Kennedy exchanged two sets of letters with Khrushchev, to no avail.[227] UN Secretary-General U Thant asked both parties to reverse their decisions and enter a cooling-off period. Khrushchev agreed, but Kennedy did not.[228] Kennedy managed to preserve restraint when a Soviet missile crew, acting without authorization, shot down a U.S. Lockheed U-2 reconnaissance aircraft over Cuba, killing pilot Rudolf Anderson.[229]
At the president's direction, Robert Kennedy privately informed Soviet Ambassador Anatoly Dobrynin that the U.S. would remove the Jupiter missiles from Turkey "within a short time after this crisis was over."[230] On October 28, Khrushchev agreed to dismantle the missile sites, subject to UN inspections.[231] The U.S. publicly promised never to invade Cuba and privately agreed to remove its Jupiter missiles from Italy and Turkey, which were by then obsolete and had been supplanted by submarines equipped with UGM-27 Polaris missiles.[232]
In the aftermath, a Moscow–Washington hotline was established to ensure clear communications between the leaders of the two countries.[233] This crisis brought the world closer to nuclear war than at any point before or after, but "the humanity" of Khrushchev and Kennedy prevailed.[234] The crisis improved the image of American willpower and the president's credibility. Kennedy's approval rating increased from 66% to 77% immediately thereafter.[235]
Believing that "those who make peaceful revolution impossible, will make violent revolution inevitable,"[236][237] Kennedy sought to contain the perceived threat of communism in Latin America by establishing the Alliance for Progress, which sent aid to some countries and sought greater human rights standards in the region.[238] In response to Kennedy's plea, Congress voted for an initial grant of $500 million in May 1961.[239] The Alliance for Progress supported the construction of housing, schools, airports, hospitals, clinics and water-purification projects, as well as the distribution of free textbooks to students.[240] However, the program did not meet many of its goals. Massive land reform was not achieved; population growth outpaced gains in health and welfare; and, according to one study, only 2 percent of economic growth in 1960s Latin America directly benefited the poor.[241][242] U.S. presidents after Kennedy were less supportive of the program, and by 1973 the permanent committee established to implement the Alliance was disbanded by the OAS.[240]
The Eisenhower administration, through the CIA, had begun formulating plans to assassinate Castro in Cuba and Rafael Trujillo in the Dominican Republic. When Kennedy took office, he privately instructed the CIA that any plan must include plausible deniability by the U.S.; his public position was one of opposition.[243] In June 1961, the Dominican Republic's leader was assassinated; in the days following, Undersecretary of State Chester Bowles led a cautious U.S. response. Robert Kennedy, who saw an opportunity for the U.S., called Bowles "a gutless bastard" to his face.[244]
After the election, Eisenhower emphasized to Kennedy that the communist threat in Southeast Asia required priority; Eisenhower considered Laos to be "the cork in the bottle" with regard to the regional threat.[245] In March 1961, Kennedy voiced a change in policy from supporting a "free" Laos to a "neutral" Laos, indicating privately that Vietnam, rather than Laos, should be deemed America's tripwire for communism's spread in the area.[245] Though he was unwilling to commit U.S. forces to a major military intervention in Laos, Kennedy did approve CIA activities designed to defeat communist insurgents through bombing raids and the recruitment of the Hmong people.[246]
Kennedy speaking at a televised press conference on the situation in Southeast Asia, March 23, 1961
Walter Cronkite of CBS News interviewing Kennedy on September 2, 1963, about U.S. involvement in Vietnam
During his presidency, Kennedy continued policies that provided political, economic, and military support to the South Vietnamese government.[247] Vietnam had been divided into a communist North Vietnam and a non-communist South Vietnam after the 1954 Geneva Conference, but Kennedy escalated American involvement in 1961 by financing the South Vietnamese army, increasing the number of U.S. military advisors above the levels of the Eisenhower administration, and authorizing U.S. helicopter units to provide support to South Vietnamese forces.[248] On January 18, 1962, Kennedy formally authorized escalated involvement when he signed the National Security Action Memorandum (NSAM) "Subversive Insurgency (War of Liberation)."[249] Operation Ranch Hand, a large-scale aerial defoliation effort using the herbicide Agent Orange, began on the roadsides of South Vietnam to deny cover to guerrilla fighters.[250][251]
Though Kennedy provided support for South Vietnam throughout his tenure, Vietnam remained a secondary issue for the Kennedy administration until 1963.[252] On September 2, Kennedy declared in an interview with Walter Cronkite of CBS:
In the final analysis, it is their war. They are the ones who have to win it or lose it. We can help them, we can give them equipment, we can send our men out there as advisers, but they have to win it, the people of Vietnam, against the Communists... But I don't agree with those who say we should withdraw. That would be a great mistake... [The United States] made this effort to defend Europe. Now Europe is quite secure. We also have to participate—we may not like it—in the defense of Asia.[253][254]
Kennedy increasingly soured on the president of South Vietnam, Ngo Dinh Diem, whose violent crackdown on Buddhist practices galvanized opposition to his leadership. In August 1963, Henry Cabot Lodge Jr. replaced Frederick Nolting as the U.S. ambassador to South Vietnam. Days after his arrival in South Vietnam, Lodge reported that several South Vietnamese generals sought the assent of the U.S. government to their plan of removing Diem from power. The Kennedy administration was split regarding not just the removal of Diem, but also its assessment of the military situation and the proper U.S. role in the country. After the State Department sent a diplomatic cable to Lodge that ordered him to pressure Diem to remove military authority from his brother, Ngô Đình Nhu, or face potential withdrawal of U.S. support and removal from power,[255] Kennedy instructed Lodge to offer covert assistance to a coup d'état, excluding assassination.[256] On November 1, 1963, a junta of senior military officers executed the coup, which led to the arrest and assassination of Diem and Nhu on November 2.[257]
By November 1963, there were 16,000 American military personnel in South Vietnam, up from Eisenhower's 900 advisors;[258] more than one hundred Americans had been killed in action, and no final policy decision had been made.[259][260][261] In the aftermath of the aborted coup attempt in September 1963, the Kennedy administration reevaluated its policies in South Vietnam. Kennedy rejected the full-scale deployment of ground soldiers but also the total withdrawal of U.S. forces.[262] Historians disagree on whether the U.S. military presence in Vietnam would have escalated had Kennedy survived and been re-elected in 1964.[263] Fueling the debate are statements made by Secretary of Defense McNamara in the 2003 documentary film The Fog of War that Kennedy was strongly considering pulling out of Vietnam after the 1964 election,[264] and comments made by Kennedy administration White House Counsel and speechwriter Ted Sorensen in a 2008 memoir suggesting that Kennedy was undecided about what policy direction to take.[265][261]
On October 11, 1963, Kennedy signed NSAM 263, ordering the withdrawal of 1,000 military personnel by the end of the year in accordance with the third recommendation of the McNamara–Taylor mission report, which concluded that the training program for the South Vietnamese military had progressed sufficiently to justify the withdrawal.[266][267][268] However, NSAM 263 also approved the report's first recommendation: to continue providing support to South Vietnam, in order to prevent the spread of communism, until the Viet Cong was suppressed. The report further suggested that even if the bulk of the U.S. military objective was completed by the end of 1965, a continued presence of U.S. training personnel in more limited numbers might be necessary if the insurgency had not been suppressed.[269][270][268]
In 1963, West Germany was enduring a time of particular vulnerability, owing to Soviet aggression to the east as well as the impending retirement of Chancellor Konrad Adenauer.[271] At the same time, French President Charles de Gaulle was trying to build a Franco-West German counterweight to the American and Soviet spheres of influence.[272][273][274] To Kennedy's eyes, this Franco-German cooperation seemed directed against NATO's influence in Europe.[275]
To reinforce the U.S. alliance with West Germany, Kennedy travelled to West Germany and West Berlin in June 1963. On June 26, Kennedy toured West Berlin, culminating in a public speech at the city hall in front of hundreds of thousands of enthusiastic Berliners.[276] He reiterated the American commitment to Germany and criticized communism and was met with an ecstatic response from the massive audience.[277] Kennedy used the construction of the Berlin Wall as an example of the failures of communism: "Freedom has many difficulties, and democracy is not perfect. But we have never had to put a wall up to keep our people in, to prevent them from leaving us." The speech is known for its famous phrase "Ich bin ein Berliner" ("I am a Berliner").[278]
Kennedy ended the arms embargo that the Truman and Eisenhower administrations had enforced on Israel in favor of increased security ties, becoming the founder of the U.S.-Israeli military alliance. Describing the protection of Israel as a moral and national commitment, he was the first to introduce the concept of a 'special relationship' between the U.S. and Israel.[279] In 1962, the Kennedy administration sold Israel a major weapon system, the Hawk antiaircraft missile. Historians differ as to whether Kennedy pursued security ties with Israel primarily to shore up support with Jewish-American voters, or because of his admiration of the Jewish state.[280]
In December 1961, Abd al-Karim Qasim's Iraqi government passed Public Law 80, which restricted the concessionary holding of the partially American-controlled Iraq Petroleum Company (IPC) to those areas in which oil was actually being produced (namely, the fields at Az Zubair and Kirkuk), effectively expropriating 99.5% of the IPC concession. British and U.S. officials demanded that the Kennedy administration place pressure on the Qasim regime.[281] In April 1962, the State Department issued new guidelines on Iraq that were intended to increase American influence. Meanwhile, Kennedy instructed the CIA—under the direction of Archibald Bulloch Roosevelt Jr.—to begin making preparations for a military coup against Qasim.[282]
The anti-imperialist and anti-communist Iraqi Ba'ath Party overthrew and executed Qasim in a violent coup on February 8, 1963. Despite persistent rumors that the CIA orchestrated the coup, declassified documents and the testimony of former CIA officers indicate that there was no direct American involvement.[283] The Kennedy administration was pleased with the outcome and ultimately approved a $55-million arms deal for Iraq.[284]
Kennedy's motorcade through Cork, Ireland, on June 28, 1963
During his four-day visit to his ancestral home of Ireland beginning on June 26, 1963,[285] Kennedy accepted a grant of armorial bearings from the Chief Herald of Ireland, received honorary degrees from the National University of Ireland and Trinity College Dublin, attended a State Dinner in Dublin, and was conferred with the freedom of the towns and cities of Wexford, Cork, Dublin, Galway, and Limerick.[286][287] He visited the cottage at Dunganstown, near New Ross, County Wexford, where his ancestors had lived before emigrating to America.[288]
Kennedy was the first foreign leader to address the Houses of the Oireachtas, the Irish parliament.[287][289][290] Kennedy later told aides that the trip was the best four days of his life.[291]
On June 10, 1963, Kennedy, at the high point of his rhetorical powers,[292] delivered the commencement address at American University. In the speech, also known as "A Strategy of Peace", Kennedy not only outlined a plan to curb nuclear arms but also "laid out a hopeful, yet realistic route for world peace at a time when the U.S. and Soviet Union faced the potential for an escalating nuclear arms race."[293] Kennedy also announced that the Soviets had expressed a desire to negotiate a nuclear test ban treaty, and that the U.S. had postponed planned atmospheric tests.[294]
Troubled by the long-term dangers of radioactive contamination and nuclear proliferation, Kennedy and Khrushchev agreed to negotiate a nuclear test ban treaty, originally conceived in Adlai Stevenson's 1956 presidential campaign.[295] In their Vienna summit meeting in June 1961, Khrushchev and Kennedy reached an informal understanding against nuclear testing, but the Soviet Union began testing nuclear weapons that September. In response, the United States conducted tests five days later.[296] Shortly afterwards, new U.S. satellites began delivering images that made it clear that the Soviets were substantially behind the U.S. in the arms race.[297] Nevertheless, the greater nuclear strength of the U.S. was of little value as long as the Soviet Union perceived itself to be at parity.[298]
In July 1963, Kennedy sent W. Averell Harriman to Moscow to negotiate a treaty with the Soviets.[299] The introductory sessions included Khrushchev, who later delegated Soviet representation to Andrei Gromyko. It quickly became clear that a comprehensive test ban would not be implemented, due largely to the reluctance of the Soviets to allow inspections to verify compliance.[300]
Ultimately, the United States, the United Kingdom, and the Soviet Union were the initial signatories to a limited treaty, which prohibited atomic testing on the ground, in the atmosphere, or underwater, but not underground. The U.S. Senate approved the treaty on September 23, 1963, and Kennedy signed it on October 7, 1963.[301] France was quick to declare that it was free to continue developing and testing its nuclear defenses.[302]
Kennedy called his domestic proposals the "New Frontier".[303] However, Kennedy's small margin of victory in the 1960 election, his lack of deep connections to influential members of Congress, and his administration's focus on foreign policy hindered the passage of New Frontier policies.[304]
In 1961, Kennedy prioritized passing five bills: federal assistance for education, medical insurance for the elderly, housing legislation, federal aid to struggling areas, and an increase in the federal minimum wage.[305] Kennedy's bill to increase the federal minimum wage to $1.25 an hour passed in early 1961, but an amendment inserted by Carl Vinson, a conservative leader from Georgia, exempted laundry workers from the law.[306] Kennedy also won passage of the Area Redevelopment Act and the Housing Act of 1961. The Area Redevelopment Act, a $394 million program, provided federal funding to economically struggling regions (primarily in Appalachia), while the Housing Act of 1961 provided funds for urban renewal and public housing and authorized federal mortgage loans to those who did not qualify for public housing.[307] Kennedy proposed a bill providing for $2.3 billion in federal educational aid to the states, with more money going to states with lower per capita income. Though the Senate passed the education bill, it was defeated in the House by a coalition of Republicans, Southern Democrats, and Catholics.[308] Kennedy's health insurance bill, which would have paid for hospitalization and nursing costs for the elderly, failed to pass either house of Congress.[309] A bill that would have established the Department of Urban Affairs and Housing was also defeated.[310]
Trade policy sat at the intersection of domestic and foreign policy. The 1962 Trade Expansion Act passed Congress by wide majorities. It authorized the president to negotiate reciprocal tariff reductions of up to 50 percent with the European Common Market.[312] The legislation paved the way for the Kennedy Round of General Agreement on Tariffs and Trade negotiations, which concluded on June 30, 1967, the last day before the expiration of the Act.[313]
Walter Heller, chairman of the Council of Economic Advisers, advocated a Keynesian-style tax cut designed to help spur economic growth, and Kennedy adopted this policy.[314] The idea was that a tax cut would stimulate consumer demand, which in turn would lead to higher economic growth, lower unemployment, and increased federal revenues.[315] To the disappointment of liberals like John Kenneth Galbraith, Kennedy's embrace of the tax cut shifted his administration's focus away from the proposed old-age health insurance program and other domestic expenditures.[316] In January 1963, Kennedy proposed a tax cut that would reduce the top marginal tax rate from 91 to 65 percent and lower the corporate tax rate from 52 to 47 percent. Projections based on the Keynesian model indicated the cuts would reduce income tax revenue by about $10 billion and corporate tax revenue by about $3.5 billion. The plan included reforms designed to reduce the impact of itemized deductions, as well as provisions to help the elderly and handicapped. Republicans and many Southern Democrats opposed the bill, calling for simultaneous reductions in expenditures, but debate continued throughout 1963.[317] Three months after Kennedy died, Johnson pushed the plan through Congress. The Revenue Act of 1964 lowered the top individual rate to 70 percent and the top corporate rate to 48 percent.[318]
President Kennedy delivers his State of the Union Address, January 14, 1963.
Kennedy ended a period of tight fiscal policies, loosening monetary policy to keep interest rates down and to encourage growth of the economy.[319] He presided over the first government budget to top the $100 billion mark, in 1962, and his first budget in 1961 resulted in the nation's first non-war, non-recession deficit.[320] The economy, which had been through two recessions in three years and was in one when Kennedy took office, accelerated notably throughout his administration. Despite low inflation and interest rates, the GDP had grown by an average of only 2.2% per annum during the Eisenhower administration (scarcely more than population growth at the time), and it had declined by 1% during Eisenhower's last twelve months in office.[321]
The economy turned around and prospered during Kennedy's presidency. The GDP expanded by an average of 5.5% from early 1961 to late 1963,[321] while inflation remained steady at around 1% and unemployment eased.[322] Industrial production rose by 15% and motor vehicle sales increased by 40%.[323] This sustained rate of growth in GDP and industry continued until around 1969.[321]
Kennedy was proud that his Labor Department helped keep wages steady in the steel industry, but he was outraged in April 1962 when Roger Blough, the president of U.S. Steel, quietly informed him that his company would raise prices.[324] In response, Attorney General Robert Kennedy began a price-fixing investigation against U.S. Steel, and President Kennedy convinced other steel companies to rescind their price increases until finally even U.S. Steel, isolated and in danger of being undersold, agreed to rescind its own.[325] An editorial in The New York Times praised Kennedy's actions and stated that the steel industry's price increase "imperil[ed] the economic welfare of the country by inviting a tidal wave of inflation."[326] Nevertheless, the administration's Bureau of the Budget reported that the price increase would have produced a net gain for GDP as well as a net budget surplus.[327] The stock market, which had steadily declined since Kennedy's election in 1960, dropped 10% shortly after the administration's action on the steel industry.[328]
Kennedy verbally supported civil rights during his 1960 presidential campaign; he telephoned Coretta Scott King, wife of Martin Luther King Jr., who had been jailed while trying to integrate a department store lunch counter. Robert Kennedy called Georgia Governor Ernest Vandiver and obtained King's release from prison, which drew additional Black support to his brother's candidacy.[329] Recognizing that conservative Southern Democrats could block legislation, Kennedy did not introduce civil rights legislation on taking office.[330] He needed their support to pass his economic and foreign policy agendas, and to support his reelection in 1964.[331] Kennedy did appoint many Blacks to office, including civil rights attorney Thurgood Marshall to the U.S. Court of Appeals.[332]
Kennedy believed the grassroots movement for civil rights would anger many Southern Whites and make it more difficult to pass civil rights laws in Congress, and he distanced himself from it.[333] As articulated by Robert Kennedy, the administration's early priority was to "keep the president out of this civil rights mess."[334] Civil rights movement participants, mainly those on the front line in the South, viewed Kennedy as lukewarm,[332] especially concerning the Freedom Riders. In May 1961, the Congress of Racial Equality, led by James Farmer, organized integrated Freedom Rides to test a Supreme Court ruling that declared segregation on interstate transportation illegal.[335] The Riders were repeatedly met with mob violence, including by federal and state law enforcement officers.[332] Kennedy assigned federal marshals to protect the Riders rather than using federal troops or uncooperative FBI agents.[332] He feared that sending federal troops would stir up "hated memories of Reconstruction" among conservative Southern Whites.[332] The Justice Department then petitioned the Interstate Commerce Commission (ICC) to enforce federal law; by September 1961, the ICC had ruled in favor of the petition.[336]
On March 6, 1961, Kennedy signed Executive Order 10925, which required government contractors to "take affirmative action to ensure that applicants are employed and that employees are treated during employment without regard to their race, creed, color, or national origin."[337] It established the President's Committee on Equal Employment Opportunity.[338]
In September 1962, James Meredith enrolled at the all-White University of Mississippi but was prevented from entering. In response, Attorney General Robert Kennedy sent 400 federal marshals.[339] The Ole Miss riot of 1962 left two dead and dozens injured, prompting Kennedy to send in 3,000 troops to quell it.[340] Meredith finally enrolled in classes. Kennedy regretted not sending in troops earlier, and he began to doubt whether the "evils of Reconstruction" he had been taught about were real.[332] On November 20, 1962, Kennedy signed Executive Order 11063, which prohibited racial discrimination in federally supported housing.[341]
On June 11, 1963, Kennedy intervened when Alabama Governor George Wallace blocked the doorway to the University of Alabama to stop two Black students, Vivian Malone and James Hood, from attending. Wallace moved aside only after being confronted by Deputy Attorney General Nicholas Katzenbach and the Alabama National Guard, which had just been federalized by order of the president. That evening Kennedy gave his famous Report to the American People on Civil Rights speech on national television and radio, launching his initiative for civil rights legislation—to provide equal access to public schools and other facilities, and greater protection of voting rights.[342]
His proposals became part of the Civil Rights Act of 1964. The day ended with the murder of an NAACP leader, Medgar Evers, in Mississippi.[343] As Kennedy had predicted, the day after his TV speech, and in reaction to it, House Majority Leader Carl Albert called to advise him that his two-year signature effort in Congress to combat poverty in Appalachia had been defeated, primarily by the votes of Southern Democrats and Republicans.[344] When Arthur Schlesinger Jr. complimented Kennedy on his remarks, Kennedy bitterly replied, "Yes, and look at what happened to area development the very next day in the House." He then added, "But of course, I had to give that speech, and I'm glad that I did."[345] On June 16, The New York Times published an editorial arguing that while Kennedy had initially "moved too slowly and with little evidence of deep moral commitment" in regard to civil rights, he "now demonstrate[d] a genuine sense of urgency about eradicating racial discrimination from our national life."[346]
Kennedy meeting with leaders of the March on Washington in the Oval Office, August 28, 1963
A crowd of over 250,000, predominantly African Americans, gathered in Washington for the civil rights March on Washington for Jobs and Freedom on August 28, 1963. Kennedy initially opposed the march, fearing it would have a negative effect on the prospects for the civil rights bills pending in Congress. These fears were heightened just prior to the march when FBI Director J. Edgar Hoover presented Kennedy with reports that some of King's close advisers, specifically Jack O'Dell and Stanley Levison, were communists.[347] When King ignored the administration's warning, Robert Kennedy authorized the FBI to wiretap King and other leaders of the Southern Christian Leadership Conference.[348] Although Kennedy only gave written approval for limited wiretapping of King's phones "on a trial basis, for a month or so,"[349] Hoover extended the clearance so his men were "unshackled" to look for evidence in any areas of King's life they deemed worthy.[350]
The Department of Justice was assigned to coordinate the federal government's involvement in the March on Washington on August 28; several hundred thousand dollars were channeled to the six sponsors of the March.[351] To ensure a peaceful demonstration, the organizers and the president personally edited speeches deemed inflammatory and collaborated on all aspects related to times and venues. Thousands of troops were placed on standby. Kennedy watched King's speech on TV and was very impressed. The March was considered a "triumph of managed protest," and not one arrest relating to the demonstration occurred. Afterwards, the March leaders accepted an invitation to the White House to meet with Kennedy, and photos were taken. Kennedy felt that the March was a victory for him as well and bolstered the chances for his civil rights bill.[351]
Three weeks later, on Sunday, September 15, a bomb exploded at the 16th Street Baptist Church in Birmingham; by the end of the day, four Black children had died in the explosion, and two others were shot to death in the aftermath.[352] Due to this resurgent violence, the civil rights legislation underwent drastic amendments that critically endangered its prospects for passage, to Kennedy's outrage. He called the congressional leaders to the White House, and by the following day the original bill, without the additions, had enough votes to get it out of the House committee.[353] Senator Everett Dirksen rallied Republican support and promised that the legislation would be brought to a vote, preventing a Senate filibuster.[354] On July 2, 1964, the guarantees Kennedy proposed in his June 1963 speech became federal law, when President Johnson signed the Civil Rights Act.[354]
During the 1960 presidential campaign, Kennedy endorsed the concept of equal pay for equal work.[355] In December 1961, Kennedy signed an executive order creating the Presidential Commission on the Status of Women to advise him on issues concerning the status of women.[356] Former First Lady Eleanor Roosevelt led the commission. The commission's final report was issued in October 1963; it documented the legal and cultural discrimination women in America faced and made several policy recommendations to bring about change.[357] On June 10, 1963, Kennedy signed the Equal Pay Act of 1963, which amended the Fair Labor Standards Act and abolished wage disparity based on sex.[358]
Under the leadership of the attorney general, the Kennedy administration shifted the focus of the Justice Department, the FBI, and the IRS to organized crime. Kennedy won congressional approval for five bills (e.g., the Federal Wire Act of 1961) designed to crack down on interstate racketeering, gambling, and the transportation of firearms.[359][360]
On March 22, 1962, Kennedy signed into law a bill abolishing the mandatory death penalty for first-degree murder in the District of Columbia, the only remaining jurisdiction in the United States with such a penalty.[361] The death penalty has not been applied in D.C. since 1957 and has since been abolished there.[362]
Kennedy had relatively little interest in agricultural issues, but he sought to remedy overproduction, boost the income of farmers, and lower federal expenditures on agriculture. Under the direction of Secretary of Agriculture Orville Freeman, the administration sought to limit farmers' production, but these proposals were generally defeated in Congress. To increase demand for domestic agricultural products and help the impoverished, Kennedy launched a pilot Food Stamp program and expanded the federal school lunch program.[363]
Construction of the Kinzua Dam flooded 10,000 acres (4,000 hectares) of Seneca nation land that they had occupied under the Treaty of 1794, and forced 600 Seneca to relocate to Salamanca, New York. Kennedy was asked by the American Civil Liberties Union to halt the project, but he declined, citing a critical need for flood control. He expressed concern about the plight of the Seneca and directed government agencies to assist in obtaining more land, damages, and assistance to mitigate their displacement.[364][365]
In the aftermath of the Soviet launch of Sputnik 1, the first artificial Earth satellite, NASA proposed a crewed lunar landing by the early 1970s.[366] Funding for the program, known as the Apollo program, was far from certain as Eisenhower held an ambivalent attitude.[367] Early in his presidency, Kennedy was poised to dismantle the crewed space program, but he postponed any decision out of deference to Vice President Johnson, who had been a strong supporter of the program in the Senate.[368] With Jerome Wiesner, Johnson was given a major role in overseeing the administration's space policy, and at Johnson's recommendation Kennedy appointed James E. Webb to head NASA.[369]
In Kennedy's State of the Union address in 1961, he suggested international cooperation in space. Khrushchev declined, as the Soviets did not wish to reveal the status of their rocketry and space capabilities.[370] In April 1961, Soviet cosmonaut Yuri Gagarin became the first person to fly in space, reinforcing American fears about being left behind by the Soviet Union.[371] Less than a month later, Alan Shepard became the first American to travel into space, strengthening Kennedy's confidence in NASA.[372] The following year, John Glenn, aboard the Mercury craft Friendship 7, became the first American to orbit the Earth.[373]
In the aftermath of Gagarin's flight, as well as the failed Bay of Pigs invasion, Kennedy felt pressured to respond to the perceived erosion of American prestige. He asked Johnson to explore the feasibility of beating the Soviets to the Moon. Though he was concerned about the program's costs, Kennedy agreed to Johnson's recommendation that the U.S. commit to a crewed lunar landing as the major objective of the space program. In a May 25 speech to Congress, Kennedy declared,[372]
... I believe that this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the Moon and returning him safely to the Earth. No single space project in this period will be more impressive to mankind, or more important for the long-range exploration of space; and none will be so difficult or expensive to accomplish.[374]
Though Gallup polling showed that many in the public were skeptical of the necessity of the Apollo program,[375] members of Congress were strongly supportive in 1961 and approved a major increase in NASA's funding. Webb began reorganizing NASA, increasing its staffing level, and building two new centers: a Launch Operations Center for the large Moon rocket northwest of Cape Canaveral Air Force Station, and a Manned Spacecraft Center in Houston. Kennedy took the latter occasion as an opportunity to deliver another speech promoting the space effort on September 12, 1962, in which he said:
No nation which expects to be the leader of other nations can expect to stay behind in this race for space. ... We choose to go to the Moon in this decade and do the other things, not because they are easy, but because they are hard.[376]
On November 21, 1962, in a cabinet meeting with Webb and other officials, Kennedy explained that the Moon shot was important for reasons of international prestige, and that the expense was justified.[377] On July 20, 1969, almost six years after Kennedy's death, Apollo 11 landed the first crewed spacecraft on the Moon.[378]
The Kennedys and the Connallys in the presidential limousine moments before the assassination in Dallas
Kennedy was assassinated while riding in the presidential motorcade through Dallas on November 22, 1963. He was taken to Parkland Hospital, where he was pronounced dead 30 minutes later, at 1:00 p.m.[385] He was 46 years old. Lee Harvey Oswald was arrested for the murder of police officer J. D. Tippit and was subsequently charged with Kennedy's assassination.[386] He denied shooting anyone, claiming he was a patsy,[387][388] and was shot dead by Jack Ruby on November 24, before he could be prosecuted.[385] Ruby was arrested and convicted for the murder of Oswald. Ruby successfully appealed his conviction, but he died of cancer on January 3, 1967, while the date for his new trial was being set.[389]
President Johnson quickly issued an executive order to create the Warren Commission—chaired by Chief Justice Earl Warren—to investigate the assassination. The commission concluded that Oswald acted alone in killing Kennedy and that Oswald was not part of any conspiracy.[390][391] These conclusions are disputed by many.[392] A Gallup Poll in November 2013 showed 61% believed in a conspiracy, and only 30% thought that Oswald did it alone.[393] In 1979, the U.S. House Select Committee on Assassinations concluded, with one third of the committee dissenting, "that Kennedy was probably assassinated as a result of a conspiracy." The committee was unable to identify the other gunmen or the extent of the conspiracy. This conclusion was based largely on audio recordings of the shooting.[394] Subsequently, investigative reports from the FBI and a specially appointed National Academy of Sciences Committee determined that "reliable acoustic data do not support a conclusion that there was a second gunman."[395] The Justice Department concluded "that no persuasive evidence can be identified to support the theory of a conspiracy".[396]
Kennedy's body was brought back to Washington. On November 23, six military pallbearers carried the flag-draped coffin into the East Room of the White House, where he lay in repose for 24 hours.[397][398] Then, the coffin was carried on a horse-drawn caisson to the Capitol to lie in state. Throughout the day and night, hundreds of thousands lined up to view the guarded casket,[399][400] with a quarter million passing through the rotunda during the 18 hours of lying in state.[399]
The Kennedy brothers: Attorney General Robert F. Kennedy, Senator Ted Kennedy, and President John F. Kennedy in 1963
The Kennedy family is one of the most established political families in the United States, having produced a president, three senators, three ambassadors, and multiple other representatives and politicians. While a congressman, Kennedy embarked on a seven-week trip to India, Japan, Vietnam, and Israel in 1951, during which he became close with his then 25-year-old brother Robert, as well as his 27-year-old sister Patricia. Because they were several years apart in age, the brothers had previously seen little of each other. This 25,000-mile (40,000 km) trip was the first extended time they had spent together and resulted in their becoming best friends.[405] Robert would eventually serve as his brother's attorney general and closest presidential advisor;[405] he would later run for president in 1968 before his assassination, while another Kennedy brother, Ted, ran for president in 1980.[406]
After a miscarriage in 1955 and the stillbirth of their daughter Arabella in 1956, the Kennedys' daughter Caroline was born in 1957. John Jr., nicknamed "John-John" by the press as a child, was born in late November 1960, 17 days after his father was elected. John Jr. died in 1999 when the small plane he was piloting crashed.[416] In August 1963, Jackie gave birth to a son, Patrick, who died two days later due to complications from his premature birth.[417]
Kennedy and his wife were younger than the presidents and first ladies who preceded them, and both were popular in the media culture in ways more common to pop singers and movie stars than politicians, influencing fashion trends and becoming the subjects of photo spreads in popular magazines. Although Eisenhower had allowed presidential press conferences to be filmed for television, Kennedy was the first president to ask for them to be broadcast live and made good use of the medium.[418] In 1961, the Radio-Television News Directors Association presented Kennedy with its highest honor, the Paul White Award, in recognition of his open relationship with the media.[419]
The Kennedys invited a range of artists, writers and intellectuals to White House dinners, raising the profile of the arts in America. On the White House lawn, they established a swimming pool and tree house, while Caroline attended a preschool with 10 other children inside the home.[420][421]
Vaughn Meader's First Family comedy album, which parodied the president, the first lady, their family, and the administration, sold about four million copies.[422]
Despite a privileged youth, Kennedy was plagued by childhood diseases, including whooping cough, chicken pox, measles, and ear infections. These ailments compelled him to spend a considerable amount of time convalescing. Three months prior to his third birthday, in 1920, Kennedy came down with scarlet fever, a highly contagious and life-threatening disease, and was admitted to Boston City Hospital.[423][14]
Kennedy and Jackie leaving the hospital following his spinal surgery, December 1954
During his years at Choate, Kennedy was beset by health problems that culminated with his emergency hospitalization in 1934 at Yale New Haven Hospital, where doctors suspected leukemia.[424] While sick, he became a passionate reader and also a fatalist.[425] In June 1934, he was admitted to the Mayo Clinic in Minnesota; the ultimate diagnosis was colitis.[424] After withdrawing from Princeton University, Kennedy was hospitalized for observation at Peter Bent Brigham Hospital in Boston. He then spent the spring of 1936 working as a ranch hand outside Benson, Arizona under Jack Speiden.[426]
Years after Kennedy's death, it was revealed that in September 1947, when he was 30 and in his first term in Congress, he was diagnosed by Sir Daniel Davis at The London Clinic with Addison's disease. Davis estimated that Kennedy would not live for another year, while Kennedy hoped he could live for ten.[427] In 1966, White House physician Janet Travell revealed that Kennedy also had hypothyroidism. The presence of two endocrine diseases raises the possibility that Kennedy had autoimmune polyendocrine syndrome type 2.[428]
Kennedy suffered from chronic severe back pain, for which he had surgery. His condition may have had diplomatic repercussions, as he appears to have been taking a combination of drugs to treat back pain during the 1961 Vienna Summit. The combination included hormones, animal organ cells, steroids, vitamins, enzymes, and amphetamines, and possible side effects included hyperactivity, hypertension, impaired judgment, nervousness, and mood swings.[429] At one time Kennedy was regularly seen by three doctors, one of whom, Max Jacobson, was initially unknown to the other two, as his mode of treatment was controversial[430] and used for the most severe bouts of back pain.[431]
Into late 1961, Kennedy's doctors disagreed over the proper balance of medication and exercise. Kennedy preferred medication because he was short on time and desired immediate relief.[298] The president's primary White House physician, George G. Burkley, set up gym equipment in the White House basement, where Kennedy did stretching exercises three times a week.[432] Details of these and other medical problems were not publicly disclosed during Kennedy's lifetime.[433] Burkley realized that the treatments prescribed by Jacobson and Travell, including excessive use of steroids and amphetamines, were medically inappropriate, and took action to remove Kennedy from their care.[434]
In 2002, Robert Dallek wrote an extensive history of Kennedy's health based on a collection of Kennedy-associated papers from 1955 to 1963, including X-rays and prescription records from Travell. According to Travell's records, during his presidential years Kennedy suffered from high fevers; stomach, colon, and prostate issues; abscesses; high cholesterol; and adrenal problems. Travell kept a "Medicine Administration Record", cataloging Kennedy's medications:
injected and ingested corticosteroids for his adrenal insufficiency; procaine shots and ultrasound treatments and hot packs for his back; Lomotil, Metamucil, paregoric, phenobarbital, testosterone, and trasentine to control his diarrhea, abdominal discomfort, and weight loss; penicillin and other antibiotics for his urinary-tract infections and an abscess; and Tuinal to help him sleep.[424]
The full extent of Kennedy's relationship with Marilyn Monroe (who in 1962 famously sang "Happy Birthday, Mr. President" at Kennedy's birthday celebration at Madison Square Garden) is not known, though it has been reported that they spent a weekend together in March 1962 while he was staying at Bing Crosby's house.[444] Furthermore, people at the White House switchboard noted that Monroe had called Kennedy during 1962.[445] J. Edgar Hoover, the FBI director, received reports about Kennedy's indiscretions.[446] These included an alleged tryst with a suspected East German spy, Ellen Rometsch. According to historian Michael Beschloss, in July 1963 Hoover reportedly informed Robert Kennedy about the affair with a woman "suspected as a Soviet intelligence agent, someone linked to East German intelligence." Robert reportedly took the matter seriously enough to raise it with leading Democratic and Republican figures in Congress.[447][448] However, the FBI never turned up "any solid evidence" that Rometsch was a spy or that she had relations with President Kennedy.[449] Former Secret Service agent Larry Newman recalled "morale problems" that the president's indiscretions engendered within the Secret Service.[450]
Kennedy inspired affection and loyalty from the members of his team and his supporters.[451] According to Reeves, this included "the logistics of Kennedy's liaisons ... [which] required secrecy and devotion rare in the annals of the energetic service demanded by successful politicians."[452] Kennedy believed that his friendly relationship with members of the press would help protect him from public revelations about his sex life.[453]
Kennedy was the first Catholic elected to the presidency.[458] During his childhood, he attended St. Aidan's Church in Brookline, Massachusetts, where he was baptized on June 19, 1917.[459]
Historians and political scientists tend to rank Kennedy as an above-average president, and he is usually the highest-ranking president who served less than one full term.[460] In a 2010 Gallup survey asking Americans their opinions of modern presidents, Kennedy was the most popular, with an 85 percent retrospective approval rating.[461] A 2014 survey by The Washington Post of 162 members of the American Political Science Association's Presidents and Executive Politics section ranked Kennedy 14th overall among the 43 persons who had been president, including then-president Barack Obama; the same survey found Kennedy to be the most overrated U.S. president.[462] A 2017 C-SPAN survey ranked Kennedy among the top ten presidents.[463] A 2023 Gallup survey showed Kennedy with a retrospective approval rating of 90 percent, the highest of any U.S. president in recent history.[464] Assessments of his policies are mixed.[465][466] Many of Kennedy's legislative proposals were passed after his death, during the Johnson administration, and his death gave those proposals a powerful moral component.[467]
Official White House portrait of Kennedy, by Aaron Shikler
The term "Camelot" is often used to describe his presidency, reflecting both the mythic grandeur accorded Kennedy in death and powerful nostalgia for that era of American history.[472] According to Richard Dean Burns and Joseph M. Siracusa, the most popular theme surrounding Kennedy's legacy is its replay of the legend of King Arthur and Camelot from Arthurian Literature.[473] In an interview following Kennedy's death, his widow Jacqueline mentioned his affection for the Broadway musical Camelot and quoted its closing lines: "Don't let it be forgot, that once there was a spot, for one brief, shining moment that was known as Camelot."[474][475] Critics, especially historians, have mocked the Camelot myth as a distortion of Kennedy's actions, beliefs, and policies. However, in the public memory, the years of Kennedy's presidency are still seen as a brief, brilliant, and shining moment.[476][477]
^After the war, Kennedy contacted the captain of the Amagiri, Kohei Hanami, and formed a friendship with him. Hanami later supported Kennedy's election campaign.[49]
^ ab"Life of John F. Kennedy". John F. Kennedy Presidential Library and Museum. Archived from the original on February 6, 2019. Retrieved February 7, 2019. This article incorporates text from this source, which is in the public domain.
^"「きのうの敵は今日の友」― ケネディ大統領と日本人艦長の友情秘話". American View (in Japanese). Embassy of the United States of America in Japan. April 5, 2015. Archived from the original on July 26, 2020. Retrieved July 26, 2020.
^"Joseph P. Kennedy Jr". John F. Kennedy Presidential Library & Museum. June 6, 2019. Archived from the original on December 13, 2023. Retrieved December 13, 2023.
^Tofel, Richard J. (May 9, 2008). "In His Own Words". The Wall Street Journal. Archived from the original on February 23, 2015. Retrieved March 28, 2010.
^Casey, Shaun A. (2009). The Making of a Catholic President: Kennedy vs. Nixon 1960. New York City: Oxford University Press.
^Lacroix, Patrick (2021). John F. Kennedy and the Politics of Faith. Lawrence: University Press of Kansas. pp. 21–44.
^ ab"Campaign of 1960". John F. Kennedy Presidential Library & Museum. March 2, 2022. Archived from the original on October 17, 2023. Retrieved October 15, 2023.
^Selverstone, Marc J. (October 4, 2016). "The Campaign and Election of 1960". University of Virginia: Miller Center. Archived from the original on April 29, 2017. Retrieved October 29, 2023.
^"JFK (Part 1)". American Experience. Season 25. Episode 7. November 11, 2013. PBS. WGBH. Archived from the original on September 25, 2019. Retrieved September 24, 2019.
^Hoberek, Andrew, ed. (2015). The Cambridge Companion to John F. Kennedy. Cambridge Companions to American Studies. New York: Cambridge University Press. p. 1. ISBN 978-1-107-66316-9.
^Robert G. Lewis, "What Food Crisis?: Global Hunger and Farmers' Woes." World Policy Journal 25.1 (2008): 29–35. Online; archived January 9, 2020, at the Wayback Machine.
^Michael O'Brien, John F. Kennedy: A biography (2005) pp. 867–68.
^ ab"John F. Kennedy and African Independence". John F. Kennedy Presidential Library & Museum. March 6, 2019. Archived from the original on November 12, 2023. Retrieved November 19, 2023. This article incorporates text from this source, which is in the public domain.
^"Kennedy's Global Challenges". U.S. History: From Pre-Columbian to the New Millennium. Archived from the original on November 19, 2023. Retrieved November 19, 2023.
^"Peace Corps". John F. Kennedy Presidential Library & Museum. Archived from the original on December 2, 2023. Retrieved January 27, 2024.
^Quesada, Alejandro de (2009). The Bay of Pigs: Cuba 1961. Elite series #166. Illustrated by Stephen Walsh. Osprey Publishing. p. 17.
^"The Bay of Pigs". John F. Kennedy Presidential Library & Museum. Archived from the original on February 23, 2021. Retrieved November 19, 2023. This article incorporates text from this source, which is in the public domain.
^JFK's "Address on the First Anniversary of the Alliance for Progress", White House reception for diplomatic cors of the Latin American republics, March 13, 1962. Public Papers of the Presidents – John F. Kennedy (1962), p. 223.
^ ab"Alliance for Progress". John F. Kennedy Presidential Library & Museum. December 15, 2021. Archived from the original on November 12, 2023. Retrieved November 19, 2023. This article incorporates text from this source, which is in the public domain.
^Logevall, Frederick (1998), "Vietnam and the Question of What Might Have Been," in Mark J. White, ed. Kennedy: The New Frontier Revisited, New York: New York University Press, pp. 25, 27, 34–53
^Shannon, Vaughn P. (2003). Balancing Act: US Foreign Policy and the Arab-Israeli Conflict. Aldershot: Ashgate Publishing. p. 55. ISBN 0754635910.
^Zachary K. Goldman, "Ties that bind: John F. Kennedy and the foundations of the American–Israeli alliance: The Cold War and Israel." Cold War History 9.1 (2009): 23–58, quoting Ben-Zvi on p. 25.
^"Nuclear Test Ban Treaty". John F. Kennedy Presidential Library & Museum. Archived from the original on July 19, 2019. Retrieved November 19, 2023. This article incorporates text from this source, which is in the public domain.
^Selverstone, Marc (2011). "JFK and the Space Race". White House Tapes–Presidential Recordings Program, Miller Center of Public Affairs, University of Virginia. Archived from the original on March 5, 2012. Retrieved February 26, 2012.
^"Life of Jacqueline B. Kennedy". John F. Kennedy Presidential Library & Museum. December 15, 2021. Archived from the original on December 18, 2023. Retrieved December 13, 2023.
^"The White House Restoration". John F. Kennedy Presidential Library & Museum. December 15, 2021. Archived from the original on December 13, 2023. Retrieved December 13, 2023.
^"Summertime Sailing". John F. Kennedy Presidential Library & Museum. February 28, 2022. Archived from the original on March 19, 2024. Retrieved March 19, 2024.
^Buccellato, Robert (2021). Images of America: Presidential Vacations in Florida. Arcadia Publishing. p. 98.
^"Life of John F. Kennedy". John F. Kennedy Presidential Library & Museum. April 26, 2023. Archived from the original on March 21, 2024. Retrieved March 21, 2024.
^"John F. Kennedy: Impact and Legacy". Miller Center of Public Affairs, University of Virginia. October 4, 2016. Archived from the original on March 18, 2017. Retrieved April 28, 2017.
^Linda Czuba Brigance, "For One Brief Shining Moment: Choosing to Remember Camelot." Studies in Popular Culture 25.3 (2003): 1–12. Online; archived September 6, 2023, at the Wayback Machine.
^Richard Dean Burns and Joseph M. Siracusa, Historical Dictionary of the Kennedy-Johnson Era (Rowman & Littlefield, 2015) pp. 75–76.
The 1963 LIFE article represented the first use of the term "Camelot" in print and is credited with having played a major role in establishing and fixing this image of the Kennedy administration and its era in the popular mind.
^An Epilogue, in LIFE, December 6, 1963, pp. 158–159
^Jon Goodman, et al., The Kennedy Mystique: Creating Camelot (National Geographic Books, 2006).
Bilharz, Joy Ann (2002) [1998]. The Allegany Senecas and Kinzua Dam: Forced Relocation Through Two Generations. Lincoln: University of Nebraska Press. ISBN 978-0-8032-1282-4.
Blight, James G.; Lang, Janet M. (2005). The Fog of War: Eleven Lessons from the Life of Robert S. McNamara. Lanham, MD: Rowman & Littlefield. ISBN 978-0-7425-4221-1.
Brauer, Carl M. (2002). "John F. Kennedy". In Graff, Henry (ed.). The Presidents: A Reference History (2nd ed.). Macmillan Library Reference USA. pp. 481–498. ISBN 0-684-80551-0.
Walton, Hanes Jr.; Smith, Robert C. (2000). American Politics and the African American Quest for Universal Freedom. New York: Addison, Wesley, Longman. ISBN 978-0-321-07038-8.
The Torch is Passed: The Associated Press Story of the Death of a President. New York: Associated Press. 1963. ISBN 978-0861015689.
NBC News (1966). There Was a President. New York: Random House.
Grant was born in Ohio and graduated from the United States Military Academy (West Point) in 1843. He served with distinction in the Mexican–American War, but resigned from the army in 1854 and returned to civilian life impoverished. In 1861, shortly after the Civil War began, Grant joined the Union Army and rose to prominence after securing victories in the western theater. In 1863, he led the Vicksburg campaign that gave Union forces control of the Mississippi River and dealt a major strategic blow to the Confederacy. President Abraham Lincoln promoted Grant to lieutenant general and command of all Union armies after his victory at Chattanooga. For thirteen months, Grant fought Robert E. Lee, from the high-casualty Overland Campaign through the Siege of Petersburg, until Lee's army was cornered at Appomattox, where Lee formally surrendered to Grant. In 1866, President Andrew Johnson promoted Grant to General of the Army. Later, Grant broke with Johnson over Reconstruction policies. A war hero, drawn in by his sense of duty, Grant was unanimously nominated by the Republican Party and then elected president in 1868.
Leaving office in 1877, Grant undertook a world tour, becoming the first president to circumnavigate the world. In 1880, he was unsuccessful in obtaining the Republican nomination for a non-consecutive third term. In 1885, impoverished and dying of throat cancer, Grant wrote his memoirs, covering his life through the Civil War, which were posthumously published and became a major critical and financial success. At his death, Grant was the most popular American and was memorialized as a symbol of national unity. Due to the pseudohistorical and negationist mythology of the Lost Cause of the Confederacy spread by Confederate sympathizers around the turn of the 20th century, historical assessments and rankings of Grant's presidency suffered considerably before they began recovering in the 21st century. Grant's critics take a negative view of his economic mismanagement and the corruption within his administration, while his admirers emphasize his policy towards Native Americans, vigorous enforcement of civil and voting rights for African Americans, and securing North and South as a single nation within the Union.[2] Modern scholarship has better appreciated Grant's appointments of Cabinet reformers.
Grant's father Jesse Root Grant was a Whig Party supporter and a fervent abolitionist.[3] Jesse and Hannah Simpson were married on June 24, 1821, and their first child, Hiram Ulysses Grant, was born on April 27, 1822.[4] The name Ulysses was drawn from ballots placed in a hat. To honor his father-in-law, Jesse named the boy "Hiram Ulysses", though he always referred to him as "Ulysses".[5] In 1823, the family moved to Georgetown, Ohio, where five siblings were born: Simpson, Clara, Orvil, Jennie, and Mary.[6] At the age of five, Ulysses started at a subscription school and later attended two private schools.[7] In the winter of 1836–1837, Grant was a student at Maysville Seminary, and in the autumn of 1838, he attended John Rankin's academy.
In his youth, Grant developed an unusual ability to ride and manage horses;[8] his father gave him work driving supply wagons and transporting people.[9] Unlike his siblings, Grant was not forced to attend church by his Methodist parents.[10] For the rest of his life, he prayed privately and never officially joined any denomination.[11] To others, including his own son, Grant appeared to be agnostic.[12] Grant was largely apolitical before the war but wrote, "If I had ever had any political sympathies they would have been with the Whigs. I was raised in that school."[13]
At Jesse Grant's request, Representative Thomas L. Hamer nominated Ulysses to the United States Military Academy at West Point, New York, in spring 1839. Grant was accepted on July 1.[14] Unfamiliar with Grant, Hamer altered his name, so Grant was enlisted under the name "U. S. Grant".[b][18] Since the initials "U.S." also stood for "Uncle Sam", he became known among army colleagues as "Sam."[19]
Initially, Grant was indifferent to military life, but within a year he reexamined his desire to leave the academy and later wrote that "on the whole I like this place very much".[20] He earned a reputation as the "most proficient" horseman.[21] Seeking relief from military routine, he studied under the Romantic artist Robert Walter Weir, producing nine surviving artworks.[22] He spent more time reading books from the library than his academic texts.[23] On Sundays, cadets were required to march to services at the academy's church, which Grant disliked.[24] Quiet by nature, he made a few close friends among fellow cadets, including Frederick Tracy Dent and James Longstreet. He was inspired both by the commandant, Captain Charles Ferguson Smith, and by General Winfield Scott, who visited the academy to review the cadets. Grant later wrote of military life, "there is much to dislike, but more to like."[25]
Grant graduated on June 30, 1843, ranked 21st out of 39 in his class, and was promoted the next day to brevet second lieutenant.[26] He planned to resign his commission after his four-year term. He would later write that among the happiest days of his life were the day he left the presidency and the day he left the academy.[27] Despite his excellent horsemanship, he was not assigned to the cavalry but to the 4th Infantry Regiment.[c] Grant's first assignment was the Jefferson Barracks near St. Louis, Missouri.[29] Commanded by Colonel Stephen W. Kearny, it was the nation's largest military base in the West.[30] Grant was happy with his commander but looked forward to the end of his military service and a possible teaching career.[31]
In 1844, Grant accompanied Frederick Dent to Missouri and met his family, including Dent's sister Julia. The two soon became engaged.[31] On August 22, 1848, they were married at Julia's home in St. Louis. Grant's abolitionist father disapproved of the Dents' owning slaves, and neither of Grant's parents attended the wedding.[32] Grant was flanked by three fellow West Point graduates in their blue uniforms, including Longstreet, Julia's cousin.[d][35]
The couple had four children: Frederick, Ulysses Jr. ("Buck"), Ellen ("Nellie"), and Jesse II.[36] After the wedding, Grant obtained a two-month extension to his leave and returned to St. Louis, where he decided that, with a wife to support, he would remain in the army.[37]
Grant's unit was stationed in Louisiana as part of the Army of Occupation under Major General Zachary Taylor.[38] In early 1846, President James K. Polk ordered Taylor to march 150 miles (240 km) south to the Rio Grande. Marching to relieve Fort Texas, then under Mexican siege, Grant experienced combat for the first time on May 8, 1846, at the Battle of Palo Alto.[39] Grant served as regimental quartermaster but yearned for a combat role; when finally allowed, he led a charge at the Battle of Resaca de la Palma.[40] He demonstrated his equestrian ability at the Battle of Monterrey by volunteering to carry a dispatch past snipers; he hung off the side of his horse, keeping the animal between him and the enemy.[41] Polk, wary of Taylor's growing popularity, divided his forces, sending some troops (including Grant's unit) to form a new army under Major General Winfield Scott.[42]
Traveling by sea, Scott's army landed at Veracruz and advanced toward Mexico City.[43] They met the Mexican forces at the battles of Molino del Rey and Chapultepec.[44] For his bravery at Molino del Rey, Grant was brevetted first lieutenant on September 30.[45] At San Cosmé, Grant directed his men to drag a disassembled howitzer into a church steeple, where they reassembled it and bombarded nearby Mexican troops.[44] His bravery and initiative earned him a brevet promotion to captain.[46] On September 14, 1847, Scott's army marched into the city; Mexico ceded the vast territory, including California, to the U.S. on February 2, 1848.[47]
During the war, Grant established a commendable record as a daring and competent soldier and began to consider a career in the army.[48][49] He studied the tactics and strategies of Scott and Taylor and emerged as a seasoned officer, writing in his memoirs that this was how he learned much about military leadership.[50] In retrospect, although he respected Scott, he identified his own leadership style with Taylor's. Grant later believed the Mexican war was morally unjust and that its territorial gains were designed to expand slavery. He opined that the Civil War was divine punishment for U.S. aggression against Mexico.[51]
Historians have pointed to the importance of Grant's experience as an assistant quartermaster during the war. Although he was initially averse to the position, it prepared Grant in understanding military supply routes, transportation systems, and logistics, particularly with regard to "provisioning a large, mobile army operating in hostile territory", according to biographer Ronald White.[40] Grant came to recognize how wars could be won or lost by factors beyond the battlefield.[52]
Grant's first post-war assignments took him and Julia to Detroit on November 17, 1848, but he was soon transferred to Madison Barracks, a desolate outpost in upstate New York badly in need of supplies and repair. After four months, Grant was sent back to his quartermaster job in Detroit.[53] When the discovery of gold in California brought prospectors and settlers to the territory, Grant and the 4th Infantry were ordered to reinforce the small garrison there. Grant was charged with bringing the soldiers and a few hundred civilians from New York City to Panama, overland to the Pacific, and then north to California. Julia, eight months pregnant with Ulysses Jr., did not accompany him.[54]
While Grant was in Panama, a cholera epidemic killed many soldiers and civilians. Grant organized a field hospital in Panama City, and moved the worst cases to a hospital barge offshore.[55] When orderlies protested having to attend to the sick, Grant did much of the nursing himself, earning high praise from observers.[54] In August, Grant arrived in San Francisco. His next assignment sent him north to Vancouver Barracks in the Oregon Territory.[56]
Grant tried several business ventures but failed; in one instance, his business partner absconded with $800 of Grant's investment, equivalent to $23,000 in 2023.[57] After witnessing white agents cheating the local Indians of their supplies, and seeing the devastation wrought on them by smallpox and measles brought by white settlers, Grant developed empathy for their plight.[58]
Promoted to captain on August 5, 1853, Grant was assigned to command Company F, 4th Infantry, at the newly constructed Fort Humboldt in California.[59] He arrived at Fort Humboldt, commanded by Lieutenant Colonel Robert C. Buchanan, on January 5, 1854.[60] Separated from his family, Grant began to drink.[61] Buchanan reprimanded Grant for one drinking episode and told him to "resign or reform." Grant told Buchanan he would "resign if I don't reform."[62] One Sunday, Grant was found under the influence of alcohol, but not incapacitated, at his company's pay table.[63] Keeping his pledge to Buchanan, Grant resigned, effective July 31, 1854.[64] Buchanan endorsed Grant's resignation but did not submit any report that verified the incident.[e][70] Grant did not face court-martial, and the War Department said: "Nothing stands against his good name."[71] Grant said years later, "the vice of intemperance (drunkenness) had not a little to do with my decision to resign."[72] With no means of support, Grant returned to St. Louis and reunited with his family.[73]
"Hardscrabble", the log house built by Grant in between wars
In 1854, at age 32, Grant entered civilian life, without any money-making vocation to support his growing family. It was the beginning of seven years of financial struggles and instability.[74] Grant's father offered him a place in the Galena, Illinois, branch of the family's leather business, but demanded Julia and the children stay in Missouri, with the Dents, or with the Grants in Kentucky. Grant and Julia declined. For the next four years, Grant farmed with the help of Julia's slave, Dan, on his brother-in-law's property, Wish-ton-wish, near St. Louis.[75] The farm was not successful and to earn a living he sold firewood on St. Louis street corners.[76]
In 1856, the Grants moved to land on Julia's father's farm and built a home called "Hardscrabble" on Grant's Farm; Julia described it as an "unattractive cabin".[77] Grant's family had little money, clothing, and furniture, but always had enough food.[78] During the Panic of 1857, which devastated Grant as it did many farmers, he pawned his gold watch to buy Christmas gifts.[79] In 1858, Grant rented out Hardscrabble and moved his family to Julia's father's 850-acre plantation.[80] That fall, after a bout of malaria, Grant gave up farming.[81] Fearing that electing a Republican president would lead to civil war, he voted for Democrat James Buchanan in 1856. He had the same fear in 1860 and preferred the Democrat Stephen A. Douglas, but did not vote that year because he did not meet the residency requirement in Galena.[82]
In 1858, Grant acquired a slave from his father-in-law, a thirty-five-year-old man named William Jones.[83] Although Grant was not an abolitionist at the time, he disliked slavery and could not bring himself to force an enslaved man to work.[84] In March 1859, Grant freed Jones by a manumission deed, forgoing a potential sale price of at least $1,000 (equivalent to $35,000 in 2024).[85]
Grant moved to St. Louis, taking on a partnership with Julia's cousin Harry Boggs, working in the real estate business as a bill collector, again without success; at Julia's prompting, he ended the partnership.[86] In August, Grant applied for a position as county engineer. He had thirty-five notable recommendations, but was passed over by the Free Soil and Republican county commissioners because he was believed to share his father-in-law's Democratic sentiments.[87]
In April 1860, Grant and his family moved north to Galena, accepting a position in his father's leather goods business, "Grant & Perkins", run by his younger brothers Simpson and Orvil. In a few months, Grant paid off his debts.[88] The family attended the local Methodist church and he soon established himself as a reputable citizen.[89]
On April 12, 1861, the American Civil War began when Confederate troops attacked Fort Sumter in Charleston, South Carolina.[90] The news came as a shock in Galena, and Grant shared his neighbors' concern about the war.[91] On April 15, Lincoln called for 75,000 volunteers.[92] The next day, Grant attended a mass meeting to assess the crisis and encourage recruitment, and a speech by his father's attorney, John Aaron Rawlins, stirred Grant's patriotism.[93] In an April 21 letter to his father, Grant wrote out his views on the upcoming conflict: "We have a government and laws and a flag, and they must all be sustained. There are but two parties now, Traitors and Patriots."[94]
On April 18, Grant chaired a second recruitment meeting but turned down a captain's position as commander of the newly formed militia company, hoping his experience would help him obtain a more senior rank.[95] His early efforts to be recommissioned were rejected by Major General George B. McClellan and Brigadier General Nathaniel Lyon. On April 29, supported by Congressman Elihu B. Washburne of Illinois, Grant was appointed military aide to Governor Richard Yates and mustered ten regiments into the Illinois militia. On June 14, again aided by Washburne, Grant was appointed colonel and put in charge of the 21st Illinois Volunteer Infantry Regiment;[96] he appointed John A. Rawlins as his aide-de-camp and brought order and discipline to the regiment. Soon after, Grant and the 21st Regiment were transferred to Missouri to dislodge Confederate forces.[97]
On August 5, with Washburne's aid, Grant was appointed brigadier general of volunteers.[98] Major General John C. Frémont, Union commander of the West, passed over senior generals and appointed Grant commander of the District of Southeastern Missouri.[99] On September 2, Grant arrived at Cairo, Illinois, assumed command by replacing Colonel Richard J. Oglesby, and set up his headquarters to plan a campaign down the Mississippi, and up the Tennessee and Cumberland rivers.[100]
After the Confederates moved into western Kentucky, taking Columbus, with designs on southern Illinois, Grant notified Frémont and, without waiting for his reply, advanced on Paducah, Kentucky, taking it without a fight on September 6.[101] Having understood the importance to Lincoln of Kentucky's neutrality, Grant assured its citizens, "I have come among you not as your enemy, but as your friend."[102] On November 1, Frémont ordered Grant to "make demonstrations" against the Confederates on both sides of the Mississippi, but prohibited him from attacking.[103]
On November 2, 1861, Lincoln removed Frémont from command, freeing Grant to attack the Confederate soldiers encamped at Belmont, Missouri.[103] On November 5, Grant, along with Brigadier General John A. McClernand, landed 2,500 men at Hunter's Point, and on November 7 engaged the Confederates at the Battle of Belmont.[104] The Union army took the camp, but the reinforced Confederates under Brigadier Generals Frank Cheatham and Gideon J. Pillow forced a chaotic Union retreat.[105] Grant had wanted to destroy the Confederate strongholds at Belmont, Missouri, and Columbus, Kentucky, but was not given enough troops and was only able to disrupt their positions. His troops escaped back to Cairo under fire from the fortified stronghold at Columbus.[106] Although Grant and his army retreated, the battle gave his volunteers much-needed confidence and experience.[107]
Columbus blocked Union access to the lower Mississippi. Grant and Lieutenant Colonel James B. McPherson planned to bypass Columbus and move against Fort Henry on the Tennessee River. They would then march east to Fort Donelson on the Cumberland River, with the aid of gunboats, opening both rivers and allowing the Union access further south. Grant presented his plan to Henry Halleck, his new commander in the newly created Department of the Missouri.[108] Halleck rebuffed Grant, believing he needed twice the number of troops; however, after consulting McClellan, he finally agreed, on the condition that the attack be made in close cooperation with Navy flag officer Andrew H. Foote.[109] Foote's gunboats bombarded Fort Henry, leading to its surrender on February 6, 1862, before Grant's infantry even arrived.[110]
Grant ordered an immediate assault on Fort Donelson, which dominated the Cumberland River. Unaware of the garrison's strength, Grant, McClernand, and Smith positioned their divisions around the fort. The next day McClernand and Smith independently launched probing attacks on apparent weak spots but were forced to retreat. On February 14, Foote's gunboats began bombarding the fort, only to be repulsed by its heavy guns. The next day, Pillow attacked and routed McClernand's division. Union reinforcements arrived, giving Grant a total force of over 40,000 men. Grant was with Foote four miles away when the Confederates attacked. Hearing the battle, Grant rode back and rallied his troop commanders, riding over seven miles of freezing roads and trenches, exchanging reports. When Grant blocked the Nashville Road, the Confederates retreated back into Fort Donelson.[111] On February 16, Foote resumed his bombardment, signaling a general attack. Confederate generals John B. Floyd and Pillow fled, leaving the fort in command of Simon Bolivar Buckner, who submitted to Grant's demand for "unconditional and immediate surrender".[112]
Grant had won the first major victory for the Union, capturing Floyd's entire army of more than 12,000. Halleck was angry that Grant had acted without his authorization and complained to McClellan, accusing Grant of "neglect and inefficiency". On March 3, Halleck sent a telegram to Washington complaining that he had no communication with Grant for a week. Three days later, Halleck claimed "word has just reached me that ... Grant has resumed his bad habits (of drinking)."[113] Lincoln, regardless, promoted Grant to major general of volunteers and the Northern press treated Grant as a hero. Playing off his initials, they took to calling him "Unconditional Surrender Grant".[114]
Reinstated by Halleck at the urging of Lincoln and Secretary of War Edwin Stanton, Grant rejoined his army with orders to advance with the Army of the Tennessee into Tennessee. His main army was located at Pittsburg Landing, while 40,000 Confederate troops converged at Corinth, Mississippi.[115] Grant wanted to attack the Confederates at Corinth, but Halleck ordered him not to attack until Major General Don Carlos Buell arrived with his division of 25,000.[116] Grant prepared to attack the Confederate army, which was of roughly equal strength. Instead of building defensive fortifications, his officers spent most of their time drilling the largely inexperienced troops, while Sherman dismissed reports of nearby Confederates.[117]
On the morning of April 6, 1862, Grant's troops were taken by surprise when the Confederates, led by Generals Albert Sidney Johnston and P. G. T. Beauregard, struck first "like an Alpine avalanche" near Shiloh church, attacking five divisions of Grant's army and forcing a confused retreat toward the Tennessee River.[118] Johnston was killed and command fell upon Beauregard.[119] One Union line held the Confederate attack off for several hours, giving Grant time to assemble artillery and 20,000 troops near Pittsburg Landing.[120] The Confederates finally broke and captured a Union division, but Grant's newly assembled line held the landing, while the exhausted Confederates, lacking reinforcements, halted their advance.[121][f]
Bolstered by 18,000 troops from the divisions of Major Generals Buell and Lew Wallace, Grant counterattacked at dawn the next day and regained the field, forcing the disorganized and demoralized rebels to retreat to Corinth.[123] Halleck ordered Grant not to advance more than one day's march from Pittsburg Landing, stopping the pursuit.[124] Although Grant had won the battle, the situation was little changed.[125] Grant, now realizing that the South was determined to fight, would later write, "Then, indeed, I gave up all idea of saving the Union except by complete conquest."[126]
Shiloh was the costliest battle in American history to that point, and the staggering 23,746 casualties stunned the nation.[127] Briefly hailed as a hero for routing the Confederates, Grant was soon mired in controversy.[128] The Northern press castigated him for the shockingly high casualties and accused him of drunkenness during the battle, contrary to the accounts of those with him at the time.[129] Discouraged, Grant considered resigning, but Sherman convinced him to stay.[130] Lincoln dismissed Grant's critics, saying "I can't spare this man; he fights."[131] Grant's costly victory at Shiloh ended any chance for the Confederates to prevail in the Mississippi valley or regain their strategic advantage in the West.[132]
Halleck arrived from St. Louis on April 11, took command, and assembled a combined army of about 120,000 men. On April 29, he relieved Grant of field command and replaced him with Major General George Henry Thomas. Halleck slowly marched his army to take Corinth, entrenching each night.[133] Meanwhile, Beauregard pretended to be reinforcing, sent "deserters" to the Union Army with that story, and moved his army out during the night, to Halleck's surprise when he finally arrived at Corinth on May 30.[134]
Halleck divided his combined army and reinstated Grant as field commander on July 11.[135] Later that year, on September 19, Grant's army defeated Confederates at the Battle of Iuka, then successfully defended Corinth, inflicting heavy casualties.[136] On October 25, Grant assumed command of the District of the Tennessee.[137] In November, after Lincoln's preliminary Emancipation Proclamation, Grant ordered units under his command to incorporate former slaves into the Union Army, giving them clothes, shelter, and wages for their services.[138]
Grant's successful gamble: Porter's gunboats running the Confederate gauntlet by night at Vicksburg on the Mississippi River
The Union capture of Vicksburg, the last Confederate stronghold on the Mississippi River, was considered vital as it would split the Confederacy in two.[139] Lincoln appointed McClernand for the job, rather than Grant or Sherman.[140] Halleck, who retained power over troop displacement, ordered McClernand to Memphis, and placed him and his troops under Grant's authority.[141]
On November 13, 1862, Grant captured Holly Springs and advanced to Corinth.[142] His plan was to attack Vicksburg overland, while Sherman would attack Vicksburg from Chickasaw Bayou.[143] However, Confederate cavalry raids on December 11 and 20 broke Union communications and recaptured Holly Springs, preventing Grant and Sherman from converging on Vicksburg.[144] McClernand reached Sherman's army, assumed command, and independently of Grant led a campaign that captured Confederate Fort Hindman.[145] After the sack of Holly Springs, Grant considered and sometimes adopted the strategy of foraging the land,[146] rather than exposing long Union supply lines to enemy attack.[147]
Fugitive African-American slaves poured into Grant's district; he sent them north to Cairo to work as domestic servants in Chicago. However, Lincoln ended this practice when Illinois political leaders complained.[148] On his own initiative, Grant set up a pragmatic program and hired Presbyterian chaplain John Eaton to administer contraband camps.[149] Freed slaves picked cotton that was shipped north to aid the Union war effort. Lincoln approved, and Grant's program was successful.[150] Grant also put freed black laborers to work on a canal to bypass Vicksburg, incorporating them into the Union Army and Navy.[151]
Grant's war responsibilities included combating illegal Northern cotton trade and civilian obstruction.[152][g] He had received numerous complaints about Jewish speculators in his district.[155] The majority, however, of those involved in illegal trading were not Jewish.[156] To help combat this, Grant required two permits, one from the Treasury and one from the Union Army, to purchase cotton.[153] On December 17, 1862, Grant issued a controversial General Order No. 11, expelling "Jews, as a class", from his military district.[157] After complaints, Lincoln rescinded the order on January 3, 1863. Grant finally ended the order on January 17. He later described issuing the order as one of his biggest regrets.[h][161]
On January 29, 1863, Grant assumed overall command. To bypass Vicksburg's guns, Grant slowly advanced his Union army south through water-logged terrain.[162] The plan of attacking Vicksburg from downriver was risky because, east of the river, his army would be distanced from most of its supply lines[163] and would have to rely on foraging. On April 16, Grant ordered Admiral David Dixon Porter's gunboats south, under fire from the Vicksburg batteries, to meet up with troops who had marched south down the west side of the river.[164] Grant ordered diversionary battles, confusing the Confederate commander John C. Pemberton and allowing Grant's army to move east across the Mississippi.[165] Grant's army captured Jackson. Advancing west, he defeated Pemberton's army at the Battle of Champion Hill on May 16, forcing its retreat into Vicksburg.[166]
After Grant's men assaulted the entrenchments twice, suffering severe losses, they settled in for a siege which lasted seven weeks. During quiet periods of the campaign, Grant would drink on occasion.[167] The personal rivalry between McClernand and Grant continued until Grant removed him from command when he contravened Grant by publishing an order without permission.[168] Pemberton surrendered Vicksburg to Grant on July 4, 1863.[169]
Vicksburg's fall gave Union forces control of the Mississippi River and split the Confederacy. By that time, Grant's political sympathies fully coincided with the Radical Republicans' aggressive prosecution of the war and emancipation of the slaves.[170] The success at Vicksburg was a morale boost for the Union war effort.[168] When Stanton suggested Grant be brought east to run the Army of the Potomac, Grant demurred, writing that he knew the geography and resources of the West better and he did not want to upset the chain of command in the East.[171]
On October 16, 1863, Lincoln promoted Grant to major general in the regular army and assigned him command of the newly formed Division of the Mississippi, which comprised the Armies of the Ohio, the Tennessee, and the Cumberland.[172] After the Battle of Chickamauga, the Army of the Cumberland retreated into Chattanooga, where they were partially besieged.[173] Grant arrived in Chattanooga, where plans to resupply and break the partial siege had already been set. Forces commanded by Major General Joseph Hooker, which had been sent from the Army of the Potomac, approached from the west and linked up with other units moving east from inside the city, capturing Brown's Ferry and opening a supply line to the railroad at Bridgeport.[174]
Grant planned to have Sherman's Army of the Tennessee, assisted by the Army of the Cumberland, assault the northern end of Missionary Ridge and roll down it onto the enemy's right flank. On November 23, Major General George Henry Thomas surprised the enemy in open daylight, advancing the Union lines and taking Orchard Knob, between Chattanooga and the ridge. The next day, Sherman failed to get atop Missionary Ridge, which was key to Grant's plan of battle, while Hooker's forces took Lookout Mountain in an unexpected success.[175] On the 25th, after Sherman's army again failed to take Missionary Ridge from the northeast, Grant ordered Thomas to advance to the rifle pits at the base of the ridge.[176] Four divisions of the Army of the Cumberland, the center two led by Major General Philip Sheridan and Brigadier General Thomas J. Wood, chased the Confederates out of the rifle pits and, against orders, continued the charge up the 45-degree slope, capturing the Confederate entrenchments along the crest and forcing a hurried retreat.[177] The decisive battle gave the Union control of Tennessee and opened Georgia, the Confederate heartland, to Union invasion.[178]
On March 2, 1864, Lincoln promoted Grant to lieutenant general, giving him command of all Union Armies.[179] Grant's new rank had previously been held only by George Washington.[180] Grant arrived in Washington on March 8 and was formally commissioned by Lincoln the next day at a Cabinet meeting.[181] Grant developed a good working relationship with Lincoln, who allowed Grant to devise his own strategy.[182]
Grant established his headquarters with General George Meade's Army of the Potomac in Culpeper, Virginia, and met weekly with Lincoln and Stanton in Washington.[183] After protest from Halleck, Grant scrapped a risky invasion of North Carolina and planned five coordinated Union offensives to prevent Confederate armies from shifting troops along interior lines.[184] Grant and Meade would make a direct frontal attack on Robert E. Lee's Army of Northern Virginia, while Sherman—now in command of all western armies—would destroy Joseph E. Johnston's Army of Tennessee and take Atlanta.[185] Major General Benjamin Butler would advance on Lee from the southeast, up the James River, while Major General Nathaniel Banks would capture Mobile.[186] Major General Franz Sigel was to capture granaries and rail lines in the fertile Shenandoah Valley.[187] Grant now commanded 533,000 battle-ready troops spread out over an eighteen-mile front.[188]
The Overland Campaign was a series of brutal battles fought in Virginia during May and June 1864.[189] Sigel's and Butler's efforts failed, and Grant was left alone to fight Lee.[190] On May 4, Grant led the army from his headquarters towards Germanna Ford.[191] They crossed the Rapidan unopposed.[192] On May 5, the Union army attacked Lee in the battle of the Wilderness, a three-day battle with estimated casualties of 17,666 Union and 11,125 Confederate.[193]
Rather than retreat, Grant flanked Lee's army to the southeast and attempted to wedge his forces between Lee and Richmond at Spotsylvania Court House.[194] Lee's army got to Spotsylvania first and a costly battle ensued, lasting thirteen days, with heavy casualties.[195] On May 12, Grant attempted to break through Lee's Muleshoe salient guarded by Confederate artillery, resulting in one of the bloodiest assaults of the Civil War, known as the Bloody Angle.[196] Unable to break Lee's lines, Grant again flanked the rebels to the southeast, meeting at North Anna, where a battle lasted three days.[197]
The recent bloody Wilderness campaign had severely diminished Confederate morale;[198] Grant believed breaking through Lee's lines at its weakest point, Cold Harbor, a vital road hub that linked to Richmond, would mean a quick end to the war.[199] Grant already had two corps in position at Cold Harbor with Hancock's corps on the way.[200]
Lee's lines extended north and east of Richmond and Petersburg for approximately ten miles, but at several points, including Cold Harbor, no fortifications had yet been built. On June 1 and 2, both Grant and Lee waited for reinforcements to arrive. Hancock's men had marched all night and arrived too exhausted for an immediate attack that morning. Grant postponed the attack until 5 p.m., and then again until 4:30 a.m. on June 3. However, Grant and Meade did not give specific orders for the attack, leaving it to the corps commanders to coordinate. Grant had not yet learned that overnight Lee had hastily constructed entrenchments to thwart any breach attempt at Cold Harbor.[201] Grant was anxious to move before the rest of Lee's army arrived. On the morning of June 3, with a force of more than 100,000 men against Lee's 59,000, Grant attacked, not realizing that Lee's army was now well entrenched, much of it obscured by trees and bushes.[202] Grant's army suffered 12,000–14,000 casualties to Lee's 3,000–5,000, but Lee was less able to replace his losses.[203]
The unprecedented number of casualties heightened anti-war sentiment in the North. After the battle, Grant wanted to appeal to Lee under a white flag for each side to gather up its wounded, most of them Union soldiers, but Lee insisted on a total truce; while the two deliberated, all but a few of the wounded died in the field.[204] Without apologizing for the disastrous defeat in his official military report, Grant confided his regret to his staff after the battle, and years later wrote in his memoirs that he "regretted that the last assault at Cold Harbor was ever made."[205]
Undetected by Lee, Grant moved his army south of the James River, freed Butler from the Bermuda Hundred, and advanced toward Petersburg, Virginia's central railroad hub,[206] resulting in a nine-month siege. Northern resentment grew. Sheridan was assigned command of the Union Army of the Shenandoah and Grant directed him to "follow the enemy to their death" in the Shenandoah Valley.[207] After Grant's abortive attempt to capture Petersburg, Lincoln supported Grant in his decision to continue.[208]
Grant had to commit troops to check Confederate General Jubal Early's raids in the Shenandoah Valley, which were getting dangerously close to Washington.[209] By late July, at Petersburg, Grant reluctantly approved a plan to blow up part of the enemy trenches from a tunnel filled with gunpowder. The massive explosion instantly killed an entire Confederate regiment.[210] The poorly led Union troops under Major General Ambrose Burnside and Brigadier General James H. Ledlie, rather than encircling the crater, rushed into it. Recovering from the surprise, Confederates, led by Major General William Mahone,[211] surrounded the crater and easily picked off Union troops. The Union's 3,500 casualties outnumbered the Confederates' three-to-one. The battle marked the first time that Union black troops, who endured a large proportion of the casualties, engaged in any major battle in the east.[212] Grant admitted that the tactic had been a "stupendous failure".[213]
Grant would later meet with Lincoln and testify at a court of inquiry against Generals Burnside and Ledlie for their incompetence.[214] In his memoirs, he blamed them for that disastrous Union defeat.[215] Rather than fight Lee in a full-frontal attack as he had done at Cold Harbor, Grant continued to force Lee to extend his defenses south and west of Petersburg, better allowing him to capture essential railroad links.[209]
Union forces soon captured Mobile Bay and Atlanta and now controlled the Shenandoah Valley, ensuring Lincoln's reelection in November.[216] Sherman convinced Grant and Lincoln to allow his army to march on Savannah.[217] Sherman cut a 60-mile (97 km) path of destruction unopposed, reached the Atlantic Ocean, and captured Savannah on December 22.[218] On December 16, after much prodding by Grant, the Union Army under Thomas smashed John Bell Hood's Confederates at Nashville.[219] These campaigns left Lee's forces at Petersburg as the only significant obstacle remaining to Union victory.[220]
By March 1865, Lee was trapped and severely weakened.[221] He was running out of reserves to replace his high battlefield casualties, and the remaining Confederate troops, having lost confidence in their commander and worn down by trench warfare, deserted by the thousands.[222] On March 25, in a desperate effort, Lee attacked Fort Stedman, sacrificing 4,000 casualties in a Union victory that proved to be his last offensive at Petersburg.
Defeated by Grant, Lee surrendered at Appomattox Court House.
On April 2, Grant ordered a general assault on Lee's forces; Lee abandoned Petersburg and Richmond, which Grant captured.[223] A desperate Lee and part of his army attempted to link up with the remnants of Joseph E. Johnston's army. Sheridan's cavalry stopped the two armies from converging, cutting them off from their supply trains.[224] Grant sent his aide Orville Babcock to carry his last dispatch to Lee demanding his surrender.[225] Grant immediately rode west, bypassing Lee's army, to join Sheridan who had captured Appomattox Station, blocking Lee's escape route. On his way, Grant received a letter from Lee stating Lee would surrender his army.[226]
On April 9, Grant and Lee met at Appomattox Court House.[227] Although Grant felt depressed at the fall of "a foe who had fought so long and valiantly," he believed the Southern cause was "one of the worst for which a people ever fought."[228] Grant wrote out the terms of surrender: "each officer and man will be allowed to return to his home, not to be disturbed by U.S. authority so long as they observe their paroles and the laws in force where they may reside." Lee immediately accepted Grant's terms and signed the surrender document, without any diplomatic recognition of the Confederacy. Lee asked that his former Confederate troops keep their horses, which Grant generously allowed.[229][230] Grant ordered his troops to stop all celebration, saying the "war is over; the rebels are our countrymen again."[231] Johnston's Tennessee army surrendered on April 26, 1865, Richard Taylor's Alabama army on May 4, and Kirby Smith's Texas army on May 26, ending the war.[232]
On April 14, 1865, Grant attended a cabinet meeting in Washington. Lincoln invited him and his wife Julia to Ford's Theatre but they declined, because they planned to travel to their home in Burlington. In a conspiracy that also targeted top cabinet members in one last effort to topple the Union, Lincoln was shot by John Wilkes Booth at the theater and died the next morning.[233] Many, including Grant himself, thought that Grant had been a target in the plot, and during the subsequent trial, the government tried to prove that Grant had been stalked by Booth's conspirator Michael O'Laughlen.[234] Stanton notified Grant of the president's death and summoned him to Washington. Vice President Andrew Johnson was sworn in as president on April 15.[235][i] Grant was determined to work with Johnson, and he privately expressed "every reason to hope" in the new president's ability to run the government "in its old channel".[236]
At the war's end, Grant remained commander of the army, with duties that included dealing with Emperor Maximilian and French troops in Mexico, enforcement of Reconstruction in the former Confederate states, and supervision of Indian wars on the western Plains.[237] After the Grand Review of the Armies, Lee and his generals were indicted for treason in Virginia. Johnson demanded they be put on trial, but Grant insisted that they should not be tried, citing his Appomattox amnesty, and the charges against Lee were dropped.[238][239] Grant secured a house for his family in Georgetown Heights in 1865 but instructed Elihu Washburne that for political purposes his legal residence remained in Galena, Illinois.[240] On July 25, 1866, Congress promoted Grant to the newly created rank of General of the Army of the United States.[241]
President Johnson's Reconstruction policy included a speedy return of the former Confederates to Congress, reinstating white people to office in the South, and relegating black people to second-class citizenship.[242] On November 27, 1865, Johnson sent Grant on a fact-finding mission to the South to counter a pending, less favorable report by Senator Carl Schurz, which found that white people in the South harbored resentment of the North and that black people suffered from violence and fraud.[243] Grant recommended continuation of the Freedmen's Bureau, which Johnson opposed, but advised against using black troops.[244]
Grant believed the people of the South were not ready for self-rule and required federal government protection. Concerned that the war had diminished respect for civil authorities, he continued using the Army to maintain order.[245] Grant's report on the South, which he later recanted, sympathized with Johnson's Reconstruction policies.[246] Although Grant desired that former Confederates be returned to Congress, he advocated eventual black citizenship. On December 19, the day after ratification of the Thirteenth Amendment was announced in the Senate, Johnson had Grant's report read aloud to the Senate to undermine Schurz's final report and Radical opposition to his policies.[247]
Cartoon illustration from Swingin' Round the Cirkle, or Andy's trip to the West by David Ross Locke, suggesting that Grant was a bigger draw on the multi-city tour than was Johnson
Grant was initially optimistic about Johnson.[248] Despite differing styles, the two got along cordially and Grant attended cabinet meetings concerning Reconstruction.[248] By February 1866, the relationship began to break down.[249] Johnson opposed Grant's closure of the Richmond Examiner for disloyal editorials and his enforcement of the Civil Rights Act of 1866, passed over Johnson's veto.[249] Needing Grant's popularity, Johnson took Grant on his "Swing Around the Circle" tour, a failed attempt to gain national support for lenient policies toward the South.[250] Grant privately called Johnson's speeches a "national disgrace" and he left the tour early.[251] On March 2, 1867, overriding Johnson's veto, Congress passed the first of three Reconstruction Acts, using military officers to enforce the policy.[252] Protecting Grant, Congress passed the Command of the Army Act, preventing his removal or relocation, and forcing Johnson to pass orders through Grant.[253]
In August 1867, bypassing the Tenure of Office Act, Johnson discharged Secretary of War Edwin Stanton, the only remaining cabinet member friendly to the Radicals, without Senate approval and appointed Grant ad interim Secretary of War. Although Grant initially recommended against dismissing Stanton, he accepted the position, not wanting the Army to fall under a conservative appointee who would impede Reconstruction, and managed an uneasy partnership with Johnson.[254]
In December 1867, Congress voted to keep Stanton, who was reinstated by a Senate committee on January 10, 1868. Grant told Johnson he was going to resign the office to avoid fines and imprisonment. Johnson, who believed the law would be overturned, said he would assume Grant's legal responsibility, and reminded Grant that he had promised to delay his resignation until a suitable replacement was found.[255] The following Monday, unwilling to wait for the law to be overturned, Grant surrendered the office to Stanton, to Johnson's confusion.[256] With the backing of his cabinet, Johnson accused Grant of lying and "duplicity" at a stormy cabinet meeting, while a shocked and disappointed Grant felt it was Johnson who was lying.[257] The publication of angry messages between Grant and Johnson led to a complete break between them.[258] The controversy led to Johnson's impeachment and trial in the Senate; he was acquitted by one vote.[259] Grant's popularity rose among the Radical Republicans, and his nomination for the presidency appeared certain.[260]
At the 1868 Republican National Convention, the delegates unanimously nominated Grant for president on the first ballot and Speaker of the House Schuyler Colfax for vice president on the fifth.[261] Although Grant had preferred to remain in the army, he accepted the Republican nomination, believing that he was the only one who could unify the nation.[262] The Republicans advocated "equal civil and political rights to all" and African American enfranchisement.[263] The Democrats, having abandoned Johnson, nominated former governor Horatio Seymour of New York for president and Francis P. Blair of Missouri for vice president. The Democrats opposed suffrage for African Americans and advocated the immediate restoration of former Confederate states to the Union and amnesty from "all past political offenses".[264][265]
Grant played no overt role during the campaign and was joined by Sherman and Sheridan in a tour of the West that summer.[266] However, the Republicans adopted his words "Let us have peace" as their campaign slogan.[267] Grant's 1862 General Order No. 11 became an issue during the presidential campaign; he sought to distance himself from the order, saying "I have no prejudice against sect or race, but want each individual to be judged by his own merit."[268] The Democrats and their Klan supporters focused mainly on ending Reconstruction, intimidating black people and Republicans, and returning control of the South to the white Democrats and the planter class, alienating War Democrats in the North.[269] Grant won the popular vote and an Electoral College landslide of 214 votes to Seymour's 80.[270] Seymour received a majority of the white vote, but Grant was aided by 500,000 votes cast by black people,[271] winning 52.7 percent of the popular vote.[272] He lost Louisiana and Georgia, primarily due to Ku Klux Klan violence against African-American voters.[273] At the age of 46, Grant was the youngest president yet elected.[274]
Official White House portrait of President Grant by Henry Ulke, 1875
On March 4, 1869, Grant was sworn in as President by Chief Justice Salmon P. Chase. In his inaugural address, Grant urged the ratification of the Fifteenth Amendment; many African Americans attended his inauguration.[275] He urged that bonds issued during the Civil War should be paid in gold, called for "proper treatment" of Native Americans and encouraged their "civilization and ultimate citizenship".[276]
Grant's cabinet appointments sparked both criticism and approval.[277] He appointed Elihu B. Washburne Secretary of State and John A. Rawlins Secretary of War.[278] When Washburne resigned, Grant appointed him Minister to France and named former New York Senator Hamilton Fish Secretary of State.[278] Rawlins died in office, and Grant appointed William W. Belknap Secretary of War.[279] Grant appointed New York businessman Alexander T. Stewart Secretary of the Treasury, but Stewart was found legally ineligible under a 1789 law,[280] so Grant appointed Massachusetts Representative George S. Boutwell instead.[278] Philadelphia businessman Adolph E. Borie was appointed Secretary of the Navy, but found the job stressful and resigned;[281] Grant replaced him with New Jersey's attorney general, George M. Robeson.[282] Former Ohio Governor Jacob D. Cox (Interior), former Maryland Senator John Creswell (Postmaster General), and Ebenezer Rockwood Hoar (Attorney General) rounded out the cabinet.[283]
Grant nominated Sherman to succeed him as general-in-chief and gave him control over the war bureau chiefs.[284] When Rawlins took over the War Department, he complained that Sherman had been given too much authority; Grant reluctantly revoked his order, upsetting Sherman and damaging their friendship. James Longstreet, a former Confederate general, was nominated for Surveyor of Customs of New Orleans; the nomination was met with amazement and seen as a genuine effort to unite the North and South.[285] In March 1872, Grant signed legislation that established Yellowstone National Park, the first national park.[286] Grant was sympathetic to women's rights, including suffrage, saying he wanted "equal rights to all citizens".[287]
To make up for his infamous General Order No. 11, Grant appointed more than fifty Jewish people to federal office, including consuls, district attorneys, and deputy postmasters. He appointed Edward S. Salomon territorial governor of Washington, the first time an American Jewish man occupied a governor's seat. In November 1869, reports surfaced of Alexander II of Russia penalizing 2,000 Jewish families for smuggling by expelling them to the interior of the country. In response, Grant publicly supported the Jewish American B'nai B'rith petition against Alexander.[288] In 1875, Grant proposed a constitutional amendment that limited religious indoctrination in public schools.[289] Schools would be for all children "irrespective of sex, color, birthplace, or religions".[290] Grant's views were incorporated into the Blaine Amendment, which was defeated in the Senate.[291]
In October 1871, Grant used federal marshals to prosecute hundreds of Mormon polygamists in Utah Territory under the Morrill Act.[292] Grant called polygamy a "crime against decency and morality".[293] In 1874, Grant signed into law the Poland Act, which made Mormon polygamists subject to trial in district courts and limited the service of Mormons on juries.[293]
Beginning in March 1873, under the Comstock Act, Grant's administration prosecuted pornographers as well as abortionists. To administer the prosecutions, Grant put the vigorous anti-vice activist and reformer Anthony Comstock in charge.[294] Comstock headed a federal commission and was empowered to destroy obscene material and hand out arrest warrants to offenders.[293]
Grant was considered an effective civil rights president, concerned about the plight of African Americans.[295] On March 18, 1869, Grant signed a law giving black people in Washington, D.C., equal rights to serve on juries and hold office, and in 1870 he signed the Naturalization Act, which gave foreign-born black people citizenship.[295] During his first term, Reconstruction took precedence. Republicans controlled most Southern states, propped up by the Republican-controlled Congress, northern money, and southern military occupation.[296] Grant advocated ratification of the Fifteenth Amendment, which barred states from disenfranchising African Americans.[297] Within a year, the three remaining states of Mississippi, Virginia, and Texas adopted the new amendment and were readmitted to Congress.[298] Grant put military pressure on Georgia to reinstate its black legislators and adopt the amendment.[299] Georgia complied, and on February 24, 1871, its senators were seated in Congress; with all former Confederate states represented, the Union was completely restored under Grant.[j][301] Under Grant, for the first time in history, Black American men served in the United States Congress, all from the Southern states.[302]
In 1870, to enforce Reconstruction, Congress and Grant created the Justice Department, which allowed the Attorney General and the new Solicitor General to prosecute the Klan.[303] Congress and Grant passed three Enforcement Acts, designed to protect black people and Reconstruction governments.[304] Using the Enforcement Acts, Grant crushed the Klan.[305] By October, Grant had suspended habeas corpus in part of South Carolina and sent federal troops to help marshals, who initiated prosecutions.[306] Grant's Attorney General, Amos T. Akerman, who replaced Hoar, was zealous in his drive to destroy the Klan.[307] Akerman and South Carolina's U.S. marshal arrested over 470 Klan members, while hundreds of Klansmen fled the state.[308] By 1872, the Klan's power had collapsed, and African Americans voted in record numbers in the South.[k][310] Attorney General George H. Williams, Akerman's replacement, suspended prosecutions of the Klan in 1873 but, before the election of 1874, changed course and prosecuted the Klan.[l][314]
During Grant's second term, the North retreated from Reconstruction, while southern conservatives called "Redeemers" formed armed groups, the Red Shirts and the White League, who openly used violence, intimidation, voter fraud, and racist appeals to overturn Republican rule.[315] Northern apathy toward black people, the depressed economy and Grant's scandals made it politically difficult for the administration to maintain support for Reconstruction. Power shifted when the House was taken over by Democrats in the 1874 election.[316] Grant ended the Brooks–Baxter War, bringing Reconstruction in Arkansas to a peaceful conclusion. He sent troops to New Orleans in the wake of the Colfax massacre and disputes over the election of Governor William Pitt Kellogg.[317][318]
By 1875, Redeemer Democrats had taken control of all but three Southern states. As violence against black Southerners escalated, Grant's Attorney General Edwards Pierrepont told Republican Governor Adelbert Ames of Mississippi that the people were "tired of the autumnal outbreaks in the South", and declined to intervene directly.[319] Grant later regretted not issuing a proclamation to help Ames, having been told Republicans in Ohio would bolt the party if he did.[320] Grant told Congress in January 1875 he could not "see with indifference Union men or Republicans ostracized, persecuted, and murdered."[321] Congress refused to strengthen the laws against violence but instead passed the sweeping Civil Rights Act of 1875 to guarantee black people access to public facilities.[322] However, there was little enforcement and the Supreme Court ruled the law unconstitutional in 1883.[323] In 1876, Grant dispatched troops to South Carolina to keep Republican Governor Daniel Henry Chamberlain in office.[324] After Grant left office, the Compromise of 1877 meant Republicans obtained the White House for Rutherford B. Hayes in return for ending enforcement of racial equality for black people and removing federal troops from the South,[325] marking the end of Reconstruction.[326]
Soon after taking office, Grant took conservative steps to return the economy to pre-war monetary standards.[327] During the War, Congress had authorized the Treasury to issue banknotes that, unlike the rest of the currency, were not backed by gold or silver. These "greenbacks" were necessary to pay the war debts, but caused inflation and forced gold-backed money out of circulation.[328] On March 18, 1869, Grant signed the Public Credit Act of 1869, which guaranteed bondholders would be repaid in "coin or its equivalent". The act committed the government to the full return of the gold standard within ten years.[329] This followed a policy of "hard currency, economy and gradual reduction of the national debt." Grant's own ideas about the economy were simple, and he relied on the advice of businessmen.[327]
Photograph of the blackboard in the New York City Gold Room on Black Friday, showing the collapse of the price of gold
In April 1869, railroad tycoons Jay Gould and Jim Fisk conspired to corner the gold market in New York.[330] They controlled the Erie Railroad, and a high gold price would allow foreign agricultural buyers to purchase exported crops, shipped east over the Erie's routes.[331] Boutwell's policy of selling gold from the Treasury biweekly, however, kept the gold price artificially low.[332] Unable to corrupt Boutwell, the schemers built a relationship with Grant's brother-in-law, Abel Corbin, and gained access to Grant.[333] Gould bribed Assistant Treasurer Daniel Butterfield to gain inside information on the Treasury.[334]
In July, Grant reduced the sale of Treasury gold to $2,000,000 per month.[335] Fisk told Grant his gold selling policy would destroy the nation.[336] By September, Grant, who was naive regarding finance, was convinced a low gold price would help farmers, and the sale of gold for September was not decreased.[337] On September 23, when the gold price reached 143 1/8, Boutwell rushed to the White House and talked with Grant.[338] On September 24, known as Black Friday, Grant ordered Boutwell to sell, whereupon Boutwell wired Butterfield to sell $4,000,000 in gold.[339] The bull market in the Gold Room collapsed, the price plummeted from 160 to 133 1/3, a bear-market panic ensued, Gould and Fisk fled, and the economic damage lasted months.[340] By January 1870, the economy resumed its post-war recovery.[m][342]
Grant had limited foreign policy experience, so he relied heavily on his talented Secretary of State, Hamilton Fish, with whom he had a cordial friendship. Besides Grant, the main players in foreign affairs were Fish and Charles Sumner, the chairman of the Senate Foreign Relations Committee. Sumner, who hated Grant, led the opposition to Grant's plan to annex Santo Domingo, despite having fully supported the annexation of Alaska.[343]
Grant had an expansionist impulse to protect American interests abroad and was a strong advocate of the Monroe Doctrine.[344] For instance, when Tomás Frías became President of Bolivia in 1872, Grant stressed the importance of maintaining good relations between Bolivia and the US.[345] His foreign policy also had an idealist side: in response to the Romanian persecution of Jews, Grant appointed a Jewish lawyer, Benjamin F. Peixotto, U.S. Consul in Bucharest. Grant said that respect "for human rights is the first duty for those set as rulers" over the nations.[346]
Secretary of State Hamilton Fish and Grant successfully settled the Alabama Claims by treaty and arbitration.
The most pressing diplomatic problem in 1869 was the settlement of the Alabama Claims, claims for the depredations inflicted on Union merchant ships by the Confederate warship CSS Alabama, built in a British shipyard in violation of neutrality rules.[347] Fish played the central role in formulating and implementing the Treaty of Washington and the Geneva arbitration (1872).[348] Senator Charles Sumner led the demand for reparations, with talk of British Columbia as payment.[349] Sumner, among other politicians, argued that British complicity in arms delivery to the Confederacy via blockade runners had prolonged the war.[350] Fish and Treasury Secretary George Boutwell convinced Grant that peaceful relations with Britain were essential, and the two nations agreed to negotiate.[351]
To avoid jeopardizing negotiations, Grant refrained from recognizing the Cuban rebels who were fighting for independence from Spain, since recognition would have been inconsistent with American objections to the British granting belligerent status to the Confederates.[n][327] A commission in Washington produced a treaty whereby an international tribunal would settle the damage amounts; the British admitted regret, but not fault.[352] The Senate, including Grant critics Sumner and Carl Schurz, approved the Treaty of Washington, which also settled disputes over fishing rights and maritime boundaries.[353] The Alabama Claims settlement was Grant's most successful foreign policy achievement, securing peace with Great Britain.[354] The $15,500,000 settlement resolved troubled Anglo-American issues and turned Britain into America's strongest ally.[355]
In 1871, a U.S. expedition was sent to Korea to open up trade with a country whose policy excluded trade with foreign powers, and to learn the fate of the U.S. merchant ship SS General Sherman, which had disappeared up the Taedong River in 1866.[356] Grant dispatched a land and naval force of five warships and over 1,200 men, under Admiral John Rodgers, to support a diplomatic delegation led by the US ambassador to China, Frederick Low, sent to negotiate trade and political relations.[356]
On June 1, the American ships entered the Ganghwa Straits on the Han River and, as foreign ships were barred from entering the river, onshore Korean garrisons fired upon the ships, but little damage was done. When Rodgers demanded an apology and to begin treaty negotiations, the Korean government refused.[357] On June 10, Rodgers destroyed several Korean forts, culminating in the Battle of Ganghwa, at which 250 Koreans were killed with a loss of 3 Americans.[357] The expedition failed to open up trade and merely strengthened Korea's isolationist policy.[358]
Santo Domingo City; watercolor by James E. Taylor, 1871
In 1869, Grant initiated his plan to annex the Dominican Republic, then called Santo Domingo.[359] Grant believed annexation would increase the United States' natural resources, strengthen U.S. naval protection to enforce the Monroe Doctrine, safeguard against British obstruction of U.S. shipping, protect a future oceanic canal, and help stop slavery in Cuba and Brazil, while giving black people in the United States a safe haven from "the crime of Ku Kluxism".[360]
Joseph W. Fabens, an American speculator who represented Buenaventura Báez, the president of the Dominican Republic, met with Secretary Fish and proposed annexation.[361] On July 17, Grant sent a military aide, Orville E. Babcock, to evaluate the island's resources, local conditions, and Báez's terms for annexation, but gave him no diplomatic authority.[362] When Babcock returned to Washington with unauthorized annexation treaties, Grant pressured his cabinet to accept them.[363] Grant ordered Fish to draw up formal treaties, which Babcock carried back to Báez on his return to the island nation: the Dominican Republic would be annexed for $1.5 million, and Samaná Bay would be lease-purchased for $2 million. Generals D.B. Sackett and Rufus Ingalls accompanied Babcock.[364] On November 29, President Báez signed the treaties, and on December 21 they were placed before Grant and his cabinet.[365]
Grant's plan, however, was obstructed by Senator Charles Sumner.[366] On December 31, Grant met with Sumner at Sumner's home to gain his support for annexation. Grant left confident that Sumner approved, but what Sumner actually said was disputed by various witnesses. Without appealing to the American public, Grant submitted the treaties on January 10, 1870, to the Senate Foreign Relations Committee, chaired by Sumner, for ratification, but Sumner shelved the bills.[367] Prompted by Grant to stop stalling the treaties, Sumner's committee took action but rejected the bills by a 5-to-2 vote. Sumner opposed annexation and reportedly said the Dominicans were "a turbulent, treacherous race" in a closed session of the Senate.[368] Sumner sent the treaties for a full Senate vote, while Grant personally lobbied other senators. Despite Grant's efforts, the Senate defeated the treaties.[369]
Grant was outraged, and on July 1, 1870, he sacked his appointed Minister to Great Britain, John Lothrop Motley, Sumner's friend and ally.[370] In January 1871, Grant signed a joint resolution to send a commission to investigate annexation.[371] He chose three neutral commissioners, with Frederick Douglass as the commission's secretary, a move that gave Grant the moral high ground over Sumner.[372] Although the commission's findings favored annexation, the Senate remained opposed, forcing Grant to abandon further efforts.[373] Seeking retribution, in March 1871, Grant maneuvered to have Sumner deposed from his powerful Senate chairmanship.[374] The stinging controversy over Santo Domingo overshadowed Grant's foreign diplomacy.[354] Critics complained of Grant's reliance on military personnel to implement his policies.[364]
American policy under Grant was to remain neutral during the Ten Years' War (1868–78) in Cuba against Spanish rule. On the recommendation of Fish and Sumner, Grant refused to recognize the rebels, in effect endorsing Spanish colonial rule, while calling for the abolition of slavery in Cuba.[375][376] This was done to protect American commerce and to keep peace with Spain.[376]
This fragile policy was broken in October 1873, when a Spanish cruiser captured the merchant ship Virginius, flying the U.S. flag and carrying supplies and men to aid the insurrection. Treating them as pirates, Spanish authorities executed 53 prisoners without trial, including eight Americans; the ship's American captain, Joseph Frye, and his crew were executed and their bodies mutilated. Enraged Americans called for war with Spain. Grant ordered U.S. Navy warships to converge on Cuba. On November 27, Fish reached a diplomatic resolution in which Spain's president, Emilio Castelar y Ripoll, expressed his regret and surrendered the Virginius and the surviving captives. Spain paid $80,000 to the families of the executed Americans.[377][378]
In the face of strong opposition from Democrats, Grant and Fish secured a free trade treaty in 1875 with Hawaii, incorporating its sugar industry into the U.S. economic sphere.[379] To secure the agreement, King Kalākaua made a 91-day state visit, becoming the first reigning monarch to set foot in the United States.[380] Despite opposition from Southern Democrats, who wanted to protect American rice and sugar producers, and from other Democrats, who believed the treaty to be an island annexation attempt and referred to the Hawaiians as an "inferior" race, a bill implementing the treaty passed Congress.[381]
The treaty gave free access to the U.S. market for sugar and other products grown in Hawaii from September 1876. The U.S. gained lands in the area known as Puʻu Loa for what would become known as the Pearl Harbor naval base. The treaty led to large investment by Americans in sugar plantations in Hawaii.[382]
When Grant took office in 1869, the nation's more than 250,000 Native Americans were governed by 370 treaties.[383] Grant's faith influenced his "peace" policy; he believed that the "Creator" did not place races of men on earth for the "stronger" to destroy the "weaker".[384] Grant was mostly an assimilationist, wanting Native Americans to adopt European customs, practices, and language, and accept democratic government, leading to eventual citizenship.[385][386] At his 1869 inauguration, Grant said "I will favor any course towards them which tends to their civilization, Christianization and ultimate citizenship."[386] Grant surprised many by appointing Ely S. Parker, an assimilated Seneca and member of his wartime staff, as Commissioner of Indian Affairs, the first Native American to serve in that position.[387][386]
In April 1869, Grant signed legislation establishing an unpaid Board of Indian Commissioners to reduce corruption and oversee the implementation of his "Peace" policy,[388] which aimed to replace the entrepreneurs serving as Native American agents with missionaries, to protect Native Americans on reservations, and to educate them in farming.[389]
Grant's policy suffered a setback in 1870 with the Marias Massacre, which caused public outrage.[390] In 1871, Grant ended the sovereign tribal treaty system; by law, individual Native Americans were deemed wards of the federal government.[391] Grant's policy was undermined by Parker's resignation in 1871, denominational infighting among religious agents, and entrenched economic interests.[392] Nonetheless, Indian wars declined overall during Grant's first term, and on October 1, 1872, Major General Oliver Otis Howard negotiated peace with the Apache leader Cochise.[393] On December 28, 1872, another setback took place when General George Crook and the 5th Cavalry massacred about 75 Yavapai Apache Indians at Skeleton Cave, Arizona.[394]
On April 11, 1873, Major General Edward Canby was killed in northern California by the Modoc leader Kintpuash.[395] Grant ordered restraint. The army captured Kintpuash and his followers, who were convicted of Canby's murder and hanged on October 3, while the remaining Modoc were relocated to the Indian Territory.[395] The beginning of the Indian Wars has been dated to this event.[396]
In 1874, the army defeated the Comanche at the Battle of Palo Duro Canyon, forcing them to settle at the Fort Sill reservation in 1875.[397] Grant pocket-vetoed a bill in 1874 protecting bison and instead supported Interior Secretary Columbus Delano, who correctly believed killing bison would force Plains Indians to abandon their nomadic lifestyle.[398] In April 1875, another setback occurred: the U.S. Army massacred 27 Cheyenne Indians in Kansas.[399]
Lured by gold discovered in the Black Hills and driven by the westward force of Manifest Destiny, white settlers trespassed on protected Sioux lands. Red Cloud reluctantly entered negotiations on May 26, 1875, but other Sioux chiefs readied for war.[400] Grant told the Sioux leaders to make "arrangements to allow white persons to go into the Black Hills" and that their children would attend schools, speak English, and prepare "for the life of white men."[385]
On November 3, 1875, on Sheridan's advice, Grant agreed not to enforce the exclusion of miners from the Black Hills, in effect forcing Native Americans onto the Sioux reservation.[401] Sheridan told Grant that the U.S. Army was undermanned and that the territory involved was vast, requiring many soldiers.[402]
During the Great Sioux War that started after Sitting Bull refused to relocate to agency land, warriors led by Crazy Horse massacred George Armstrong Custer and his men at the Battle of the Little Big Horn. Angry white settlers demanded retribution. Grant castigated Custer in the press, saying "I regard Custer's massacre as a sacrifice of troops, brought on by Custer himself, that was wholly unnecessary."[403] In September and October 1876, Grant persuaded the tribes to relinquish the Black Hills. Congress ratified the agreement three days before Grant left office in 1877.[404]
In spite of Grant's peace efforts, over 200 battles were fought with Native Americans during his presidency. Grant's peace policy survived Custer's death, even after Grant left office in 1877; Indian policy remained under the Interior Department rather than the War Department.[405] The policy was considered humanitarian for its time but later criticized for disregarding tribal cultures.[406]
Cartoon by Thomas Nast on Grant's opponents in the reelection campaign
The Liberal Republicans—reformers, men who supported low tariffs, and those who opposed Grant's prosecution of the Klan—broke from Grant and the Republican Party.[407] The Liberals disliked Grant's alliance with Senators Simon Cameron and Roscoe Conkling, considered to be spoilsmen politicians.[408]
In 1872, the Liberals nominated Horace Greeley, a New York Tribune editor and enemy of Grant, for president, and Missouri governor B. Gratz Brown for vice president.[409] The Liberals denounced Grantism, corruption, and inefficiency, and demanded withdrawal of federal troops from the South, literacy tests for black voters, and amnesty for Confederates.[410] The Democrats adopted the Greeley-Brown ticket and the Liberal party platform.[411] Greeley pushed the theme that the Grant administration was a corrupt failure.[412]
The Republicans nominated Grant for reelection, with Senator Henry Wilson of Massachusetts as the vice presidential nominee.[o][414] The Republicans shrewdly borrowed from the Liberal platform, including "extended amnesty, lowered tariffs, and embraced civil service reform."[415] Grant lowered customs duties, gave amnesty to Confederates, and implemented a civil service merit system, neutralizing the opposition.[416] To placate the burgeoning suffragist movement, the Republican platform said women's rights would be treated with "respectful consideration."[417] Concerning Southern policy, Greeley advocated that local government control be given to white people, while Grant advocated federal protection of black people.[418] Grant was supported by Frederick Douglass, prominent abolitionists, and Indian reformers.[419]
Grant won reelection easily thanks to federal prosecution of the Klan, a strong economy, debt reduction, and lowered tariffs and taxes.[420] He received 56% of the vote and an Electoral College landslide (286 to 66).[421][422] Most African Americans in the South voted for Grant, while Democratic opposition remained mostly peaceful.[423] Grant lost in six former slave states that wanted an end to Reconstruction.[424] He proclaimed the victory as a personal vindication, but felt betrayed by the Liberals.[425]
Grant was sworn in by Salmon P. Chase on March 4, 1873. In his second inaugural address, he focused on what he considered the chief issues: freedom and fairness for all Americans and the benefits of citizenship for freed slaves. Grant concluded his address: "My efforts in the future will be directed towards the restoration of good feelings between the different sections of our common community".[p][427] Wilson died in office on November 22, 1875.[428] After Wilson's death, Grant relied on Fish's guidance more than ever.[429]
Grant signed the Coinage Act of 1873, effectively ending the legal basis for bimetallism.[430] The Coinage Act discontinued the standard silver dollar and established the gold dollar as the monetary standard; because the gold supply did not increase as quickly as the population, the result was deflation. Silverites, who wanted more money in circulation to raise the prices farmers received, denounced the move as the "Crime of 1873", claiming deflation made debts more burdensome for farmers.[431]
Grant is congratulated for vetoing the "inflation bill" in 1874.
Economic turmoil renewed during Grant's second term. In September 1873, Jay Cooke & Company, a New York brokerage house, collapsed after it failed to sell all the bonds issued by the Northern Pacific Railway. Other banks and brokerages that owned railroad stocks and bonds were ruined.[432][433] Grant, who knew little about finance, traveled to New York to consult leading businessmen on how to resolve the crisis, which became known as the Panic of 1873.[434] Grant believed that, as with the collapse of the Gold Ring in 1869, the panic was merely an economic fluctuation.[435] He instructed the Treasury to buy $10 million in government bonds, which curbed the panic, but the Long Depression swept the nation.[434] Eighty-nine of the nation's 364 railroads went bankrupt.[436]
In 1874, hoping inflation would stimulate the economy, Congress passed the Ferry Bill.[437] Many farmers and workingmen favored the bill, which would have added $64 million in greenbacks to circulation, but some Eastern bankers opposed it because it would have weakened the dollar.[438] Belknap, Williams, and Delano told Grant a veto would hurt Republicans in the November elections. Grant believed the bill would destroy the credit of the nation and vetoed it despite their objections. Grant's veto placed him in the Republican conservative faction and began the party's commitment to a gold-backed dollar.[439] Grant later pressured Congress for a bill to strengthen the dollar by gradually reducing the greenbacks in circulation; after the Democrats gained a House majority in the 1874 elections, the lame-duck Republican Congress passed such a bill before the new members took office.[440] On January 14, 1875, Grant signed the Specie Payment Resumption Act, which required a reduction of the greenbacks allowed to circulate and declared that they would be redeemed for gold beginning on January 1, 1879.[441]
The post-Civil War economy brought on massive industrial wealth and government expansion. Speculation, lifestyle extravagance, and corruption in federal offices were rampant.[442] All of Grant's executive departments were investigated by Congress.[443] Grant was by nature honest, trusting, gullible, and loyal to his friends. His responses to malfeasance were mixed: at times he appointed cabinet reformers, at others he defended culprits.[444]
Cartoonist Thomas Nast praises Grant for rejecting demands by Pennsylvania politicians to suspend civil service rules.
In his first term, Grant appointed Secretary of the Interior Jacob D. Cox, who implemented civil service reform, including firing unqualified clerks.[445] On October 3, 1870, Cox resigned after a dispute with Grant over the handling of a mining claim.[446] Authorized by Congress on March 3, 1871, Grant created and appointed the first Civil Service Commission.[447] Grant's Commission created rules for competitive exams for appointments, ending mandatory political assessments and classifying positions into grades.[448][q]
In November 1871, Grant's appointed New York Collector, Thomas Murphy, resigned, and Grant replaced him with Chester A. Arthur, who implemented Boutwell's reforms.[450] A Senate committee investigated the New York Customs House in 1872. Murphy and Moses H. Grinnell, previous collectors appointed by Grant, had charged lucrative fees for warehouse space without the legal requirement of listing the goods.[451] This led Grant to fire warehouse owner George K. Leet for pocketing the exorbitant freight fees.[452] Boutwell's reforms included stricter record-keeping and a requirement that goods be stored on company docks.[451] Grant ordered Attorney General George H. Williams and Secretary of the Treasury Boutwell to prosecute persons accepting and paying bribes.[453]
On March 3, 1873, Grant signed into law an appropriation act that increased pay for federal employees, Congress (retroactive), the judiciary, and the president.[454][451] Grant's annual salary doubled to $50,000. Critics derided Congress' two-year retroactive $4,000 payment for each Congressman, and the law was partially repealed. Grant kept his much-needed pay raise, while his reputation remained intact.[455][451]
In 1872, Grant signed into law an act that ended private moiety (tax collection) contracts, but an attached rider allowed three more contracts.[456] Boutwell's assistant secretary, William A. Richardson, hired John B. Sanborn to go after "individuals and corporations" who allegedly evaded taxes. Sanborn aggressively collected $213,000, splitting $156,000 with others, including Richardson and the Republican Party campaign committee.[457][451] During an 1874 Congressional investigation, Richardson denied involvement, but Sanborn said he had met with Richardson over the contracts.[458] Congress severely condemned Richardson's permissive manner. Grant appointed Richardson judge of the Court of Claims and replaced him with the reformer Benjamin Bristow.[459] In June, Grant and Congress abolished the moiety system.[460]
Bristow tightened up the Treasury's investigation force, implemented civil service, and fired hundreds of corrupt appointees.[461] Bristow discovered that Treasury receipts were low and launched an investigation that uncovered the notorious Whiskey Ring, which involved collusion between distillers and Treasury officials to evade millions in taxes.[462][463] In mid-April, Bristow informed Grant of the ring. On May 10, Bristow struck hard and broke the ring.[464] Federal marshals raided 32 installations nationwide, leading to 110 convictions and $3,150,000 in fines.[465]
Harper's Weekly cartoon on Bristow's Whiskey Ring investigation
On Bristow's recommendation, Grant appointed David Dyer federal attorney in St. Louis to prosecute the Ring; Dyer indicted Grant's friend General John McDonald, supervisor of Internal Revenue.[466] Grant endorsed Bristow's investigation, writing on a letter "Let no guilty man escape..."[467] Bristow's investigation discovered that Babcock had received kickback payments and had secretly forewarned McDonald, the ring's mastermind, of the investigation.[468] On November 22, the jury convicted McDonald.[469] On December 9, Babcock was indicted; Grant refused to believe in Babcock's guilt and was ready to testify in Babcock's favor, but Fish warned that doing so would put Grant in the embarrassing position of testifying against a case prosecuted by his own administration.[470] Instead, on February 12, 1876, Grant gave a deposition in Babcock's defense, expressing that his confidence in his secretary was "unshaken".[471] Grant's testimony silenced all but his strongest critics.[472]
The St. Louis jury acquitted Babcock, and Grant allowed him to remain at the White House. However, after Babcock was indicted in the frame-up of a Washington reformer known as the Safe Burglary Conspiracy, Grant dismissed him, though Babcock kept his position as Superintendent of Public Buildings in Washington.[473][451]
The Interior Department under Secretary Columbus Delano, whom Grant appointed to replace Cox, was rife with fraud and corruption; the exception was Delano's effective oversight of Yellowstone. After Surveyor General Silas Reed was found to have set up corrupt contracts that benefited Delano's son, John Delano, Grant reluctantly forced Delano's resignation.[474] Grant's next Secretary of the Interior, Zachariah Chandler, who succeeded Delano in 1875, implemented reforms, fired corrupt agents, and ended profiteering.[475] When Postmaster General Marshall Jewell informed Grant of a potential Congressional investigation into an extortion scandal involving Attorney General George H. Williams' wife, Grant fired Williams and appointed the reformer Edwards Pierrepont. Grant's new cabinet appointments temporarily appeased reformers.[476]
After the Democrats took control of the House in 1875, more corruption in federal departments was exposed.[477] Among the most damaging was a scandal involving Secretary of War William W. Belknap, who took quarterly kickbacks from the Fort Sill tradership and resigned in February 1876.[478] Belknap was impeached by the House but acquitted by the Senate.[479] Grant's brother Orvil had set up "silent partnerships" and received kickbacks from four trading posts.[480] Congress discovered that Secretary of the Navy Robeson had been bribed by a naval contractor, but no articles of impeachment were drawn up.[481] In his December 5, 1876, Annual Message, Grant apologized to the nation: "Failures have been errors of judgement, not of intent."[482]
The abandonment of Reconstruction played a central role during the 1876 election.[483] Mounting investigations into corruption by the House, controlled by the Democrats, discredited Grant's presidency.[484] Grant did not run for a third term, while the Republicans chose Governor Rutherford B. Hayes of Ohio, a reformer, at their convention.[485] The Democrats nominated Governor Samuel J. Tilden of New York. Voting irregularities in three Southern states caused the election to remain undecided for several months.[486][487]
Grant told Congress to settle the matter through legislation and assured both sides that he would not use the army to force a result, except to curb violence. On January 29, 1877, he signed legislation forming an Electoral Commission,[488] which ruled Hayes elected president; to forestall Democratic protests, Republicans agreed to the Compromise of 1877, in which the last troops were withdrawn from Southern capitals. With Reconstruction dead, 80 years of Jim Crow segregation followed.[489] Grant's "calm visage" throughout the election crisis appeased the nation.[490]
After leaving the White House, Grant said he "was never so happy in my life". The Grants left Washington for New York, to attend the birth of their daughter Nellie's child. Calling themselves "waifs", the Grants toured Cincinnati, St. Louis, Chicago, and Galena, without a clear idea of where they would live.[491]
As a courtesy to Grant from the Hayes administration, his touring party received federal transportation on three U.S. Navy ships: a five-month tour of the Mediterranean on the USS Vandalia, travel from Hong Kong to China on the USS Ashuelot, and from China to Japan on the USS Richmond.[496] The Hayes administration encouraged Grant to assume an unofficial public diplomatic role and strengthen American interests abroad during the tour.[497] Homesick, the Grants left Japan on the SS City of Tokio and landed in San Francisco on September 20, 1879, greeted by cheering crowds.[498] Grant's tour demonstrated to Europe and Asia that the United States was an emerging world power.[499]
Cartoonist Joseph Keppler lampooned Grant and his associates. Puck, 1880
Politically conservative,[500] Grant was supported by the Stalwarts who, led by Grant's old political ally Roscoe Conkling, saw Grant's renewed popularity as an opportunity, and sought to nominate him for the presidency in 1880. Opponents called it a violation of the unofficial two-term rule in use since George Washington. Grant said nothing publicly but wanted the job and encouraged his men.[501] Washburne urged him to run; Grant demurred. Even so, Conkling and John A. Logan began to organize delegates in Grant's favor. When the convention convened in Chicago in June, more delegates were pledged to Grant than to any other candidate, but he was still short of a majority vote.[502]
At the convention, Conkling nominated Grant with an eloquent speech, the most famous line being "When asked which state he hails from, our sole reply shall be, he hails from Appomattox and its famous apple tree."[502] With 378 votes needed for the nomination, the first ballot had Grant at 304, Blaine at 284, Sherman at 93, and the rest to minor candidates.[503] After thirty-six ballots, Blaine's delegates combined with those of other candidates to nominate a compromise candidate: James A. Garfield.[504] A procedural motion made the vote unanimous for Garfield.[505] Grant gave speeches for Garfield but declined to criticize the Democratic nominee, Winfield Scott Hancock, a general who had served under him.[506] Garfield won the election. Grant gave Garfield his public support and pushed him to include Stalwarts in his administration.[507] On July 2, 1881, Garfield was shot by an assassin and died on September 19. On learning of Garfield's death from a reporter, Grant wept.[508]
In the 19th century, there were no federal presidential pensions, and the Grants' personal income was $6,000 a year.[509] Grant's world tour had been costly, and he had depleted most of his savings.[510] Wealthy friends bought him a house on Manhattan's Upper East Side, and to make an income, Grant, Jay Gould, and former Mexican Finance Secretary Matías Romero chartered the Mexican Southern Railroad, with plans to build a railroad from Oaxaca to Mexico City. Grant urged President Chester A. Arthur to negotiate a free trade treaty with Mexico. Arthur and the Mexican government agreed, but the United States Senate rejected the treaty in 1883. The railroad was similarly unsuccessful, falling into bankruptcy the following year.[511]
At the same time, Grant's son Buck had opened a Wall Street brokerage house with Ferdinand Ward. A conniving man who swindled numerous wealthy men, Ward was at the time regarded as a rising star on Wall Street. The firm, Grant & Ward, was initially successful.[512] In 1883, Grant joined the firm and invested $100,000 (~$2.78 million in 2023) of his own money.[513] Ward paid investors abnormally high interest by pledging the company's securities on multiple loans, a practice called rehypothecation; the scheme is now regarded as a Ponzi scheme.[514] In collusion with banker James D. Fish, and unbeknownst to bank examiners, Ward retrieved the firm's securities from the company's bank vault.[515] When the trades went bad, multiple loans came due, all backed by the same collateral.[516]
Historians agree that the elder Grant was likely unaware of Ward's intentions, but it is unclear how much Buck Grant knew. In May 1884, enough investments went bad to convince Ward that the firm would soon be bankrupt. Ward, who assumed Grant was "a child in business matters",[517] told him of the impending failure, but assured Grant that this was a temporary shortfall.[518] Grant approached businessman William Henry Vanderbilt, who gave him a personal loan of $150,000.[519] Grant invested the money in the firm, but it was not enough to save it. The fall of Grant & Ward set off the Panic of 1884.[516]
Vanderbilt offered to forgive Grant's debt entirely, but Grant refused.[520] Impoverished but compelled by personal honor, he repaid what he could with his Civil War mementos and the sale or transfer of all his other assets.[521] Vanderbilt took title to Grant's home, though he allowed the Grants to continue living there; he pledged to donate the souvenirs to the federal government and insisted the debt had been paid in full.[522] Grant was distraught over Ward's deception and asked privately how he could ever "trust any human being again."[523] In March 1885, he testified against both Ward and Fish.[524] After the collapse of Grant & Ward, there was an outpouring of sympathy for Grant.[525]
Grant working on his memoirs, less than a month before his death
Grant attended a service for Civil War veterans in Ocean Grove, New Jersey, on August 4, 1884, receiving a standing ovation from the ten thousand attendees; it would be his last public appearance.[526] In the summer of 1884, Grant complained of a sore throat but put off seeing a doctor until late October, when he learned it was cancer, possibly caused by his frequent cigar smoking.[527] Grant chose not to reveal the seriousness of his condition to his wife, who soon found out from Grant's doctor.[528] In March 1885, The New York Times announced that Grant was dying of cancer, causing nationwide public concern.[529][530] Knowing of Grant and Julia's financial difficulties, Congress restored him to the rank of General of the Army with full retirement pay—Grant's assumption of the presidency had required that he resign his commission and forfeit his (and his widow's) pension.[531]
Grant was nearly penniless and worried about leaving his wife money to live on. He approached The Century Magazine and wrote a number of articles on his Civil War campaigns for $500 (equivalent to $17,000 in 2024) each. The articles were well received by critics, and the editor, Robert Underwood Johnson, suggested that Grant write a memoir, as Sherman and others had done.[532] The magazine offered him a book contract with a 10% royalty. However, Grant's friend Mark Twain, one of the few who understood Grant's precarious financial condition, offered him an unheard-of 70% royalty.[516] To provide for his family, Grant worked intensely on his memoirs in New York City. His former staff member Adam Badeau assisted with the research, while his son Frederick located documents and did much of the fact-checking.[533] Because of the summer heat and humidity, his doctors recommended that he move upstate to a cottage at the top of Mount McGregor, offered by a family friend.[534]
On July 18, 1885, Grant finished his memoir,[535] which covers the events of his life up to the end of the Civil War.[536] The Personal Memoirs of U. S. Grant was a critical and commercial success. Julia Grant eventually received about $450,000 in royalties (equivalent to $15,700,000 in 2024). The memoir has been highly regarded by the public, military historians, and literary critics.[516] Grant portrayed himself as an honorable Western hero whose strength lay in his honesty. He candidly depicted his battles against both the Confederates and internal army foes.[537]
Grant's funeral train at West Point
Grant died in the Mount McGregor cottage on July 23, 1885.[538] Sheridan, then Commanding General of the Army, ordered a day-long tribute to Grant on all military posts, and President Grover Cleveland ordered a thirty-day nationwide period of mourning. After private services, the honor guard placed Grant's body on a funeral train, which traveled to West Point and New York City. A quarter of a million people viewed his body in the two days before the funeral.[516] Tens of thousands of men, many of them veterans of the Grand Army of the Republic (GAR), marched with Grant's casket, drawn by two dozen black stallions, to Riverside Park in Morningside Heights, Manhattan.[539] His pallbearers included Union generals Sherman and Sheridan, Confederate generals Simon Bolivar Buckner and Joseph E. Johnston, Admiral David Dixon Porter, and Senator John A. Logan, the head of the GAR.[540] Following the casket in the seven-mile-long (11 km) procession were President Cleveland, the two living former presidents, Hayes and Arthur, the entire cabinet, and the justices of the Supreme Court.[541]
Attendance at the New York funeral topped 1.5 million.[542] Ceremonies were held in other major cities around the country, while Grant was eulogized in the press.[543] Grant's body was laid to rest in Riverside Park, first in a temporary tomb, and then on April 17, 1897, in the General Grant National Memorial, known as "Grant's Tomb", the largest mausoleum in North America.[540]
Grant was hailed across the North as the general who "saved the Union", and overall his military reputation has held up well. Having achieved great national fame for his victories at Vicksburg and the surrender at Appomattox, Grant was the most successful general, Union or Confederate, in the American Civil War.[544] He was criticized by the South for using excessive force,[545] and his drinking was often exaggerated by the press and stereotyped by rivals and critics.[546] Historians also debate how effective Grant was at halting corruption.[547] The scandals during his administration stigmatized his political reputation.[548] Despite those scandals, Grant was still respected by most of the nation at the time of his death, as indicated by the praise from Democratic president Cleveland and even some former Confederate generals, two of whom served as his pallbearers.[549]
However, Grant's reputation declined soon after his death. During the late 19th and early 20th centuries, it was damaged by the "Lost Cause" movement and the Dunning School.[550] Grant's standing fell particularly in the late 1910s and early 1920s, when US deaths in World War I brought back memories of Union deaths in Virginia in 1864, and the scandals of the Warren Harding administration revived memories of the Grant administration's scandals.[551][552] Views of Grant reached new lows as he was seen as an unsuccessful president and an unskilled, if lucky, general.[553] In the 1950s, some historians reassessed Grant's military career, shifting the analysis of Grant from a victor by brute force to a skillful modern strategist and commander.[554] Historian William S. McFeely's biography, Grant (1981), won the Pulitzer Prize and brought renewed scholarly interest in Grant. McFeely believed Grant was an "ordinary American" trying to "make his mark" during the 19th century.[555]
In the 21st century, Grant's reputation improved markedly among historians after the publication of Grant (2001) by historian Jean Edward Smith.[556][557] Opinions of Grant's presidency demonstrate a better appreciation of his personal integrity, Reconstruction efforts, and peace policy towards Indians, even when they fell short.[558][559] H. W. Brands' The Man Who Saved the Union (2012), Ronald C. White's American Ulysses (2016), and Ron Chernow's Grant (2017) continued the elevation of Grant's reputation.[560] White said that Grant "demonstrated a distinctive sense of humility, moral courage, and determination", and that as president he "stood up for African Americans, especially fighting against voter suppression perpetrated by the Ku Klux Klan".[561] White believed that Grant was "an exceptional person and leader".[562] Historian Robert Farley writes that the "Cult of Lee" and the Dunning School's resentment of Grant for his defeat of Lee and his strong enforcement of Reconstruction resulted in Grant's shoddy treatment by historians.[563]
In a 2021 C-SPAN survey ranking presidents from worst to best, Grant was ranked 20 out of 44 presidents, up from his previous ranking of 33 in 2017. This was due to the rehabilitation of his image and legacy in recent years, with Grant now receiving "more credit for Reconstruction and his diplomacy than condemnation for his alleged corruption."[564]
^One source states Hamer took the "S" from Simpson, Grant's mother's maiden name.[15] According to Grant, the "S." did not stand for anything, and upon graduation from the academy he adopted the name "Ulysses S. Grant".[16] Another version of the story states that Grant inverted his first and middle names to register at West Point as "Ulysses Hiram Grant", thinking that reporting to the academy with a trunk carrying the initials H.U.G. would subject him to teasing and ridicule. Upon finding that Hamer had nominated him as "Ulysses S. Grant", Grant decided to keep the name so that he could avoid the "hug" monogram; it was also easier to keep the wrong name than to try to change school records.[17]
^At the time, class ranking largely determined branch assignments. Those at the top of the class were usually assigned to the Engineers, followed by Artillery, Cavalry, and Infantry.[28]
^Several scholars, including Jean Edward Smith and Ron Chernow, state that Longstreet was Grant's best man and the two other officers were Grant's groomsmen.[33] All three went on to serve in the Confederate Army and surrendered to Grant at Appomattox.[34]
^William McFeely said that Grant left the army simply because he was "profoundly depressed" and that the evidence as to how much and how often Grant drank remains elusive.[65] Jean Edward Smith maintains Grant's resignation was too sudden to be a calculated decision.[66] Buchanan never mentioned it again until asked about it during the Civil War.[67] The effects and extent of Grant's drinking on his military and public career are debated by historians.[68] Lyle Dorsett said Grant was an "alcoholic" but functioned amazingly well. William Farina maintains Grant's devotion to family kept him from drinking to excess and sinking into debt.[69]
^The April 6th fighting had been costly, with thousands of casualties. That evening, heavy rain set in. Sherman found Grant standing alone under a tree in the rain. "Well, Grant, we've had the devil's own day of it, haven't we?" Sherman said. "Yes," replied Grant. "Lick 'em tomorrow, though."[122]
^Smuggling of cotton was rampant, while the price of cotton skyrocketed.[153] Grant believed the smuggling funded the Confederacy and provided them with military intelligence.[154]
^In 2012, historian Jonathan D. Sarna said: "Gen. Ulysses S. Grant issued the most notorious anti-Jewish official order in American history."[158] Grant made amends with the Jewish community during his presidency, appointing them to various federal positions.[159] In 2017, biographer Ron Chernow said of Grant: "As we shall see, Grant as president atoned for his action in a multitude of meaningful ways. He was never a bigoted, hate-filled man and was haunted by his terrible action for the rest of his days."[160]
^Attending Lincoln's funeral on April 19, Grant stood alone and wept openly; he later said Lincoln was "the greatest man I have ever known".[235]
^Southern Reconstructed states were controlled locally by Republican carpetbaggers, scalawags and former slaves. By 1877, the conservative Democrats had full control of the region and Reconstruction was dead.[300]
^To placate the South in 1870, Grant signed the Amnesty Act, which restored political rights to former Confederates.[309]
^Additionally, Grant's Postmaster General, John Creswell, used his patronage powers to integrate the postal system, appointing a record number of African-American men and women as postal workers across the nation while also expanding many of the mail routes.[311][312] Grant appointed Republican abolitionist and champion of black education Hugh Lennox Bond as a U.S. Circuit Court judge.[313]
^An 1870 Congressional investigation chaired by James A. Garfield cleared Grant of profiteering, but excoriated Gould and Fisk for their manipulation of the gold market and Corbin for exploiting his personal connection to Grant.[341]
^Urged by his Secretary of War Rawlins, Grant initially supported recognition of Cuban belligerency, but Rawlins's death on September 6, 1869, removed any cabinet support for military intervention.[327]
^Revelations about the 1867 Crédit Mobilier bribery scandal, implicating both Colfax and Wilson, stung the Grant administration, but Grant himself was not connected to the corruption.[413]
^The day after his inauguration, Grant wrote a letter to Colfax expressing his faith and trust in Colfax's integrity and allowed him to publish the letter, but the effort only served to compromise Grant's reputation.[426]
^When Congress failed to make the Commission's reform rules permanent, Grant dissolved the Commission in 1874.[449]
^Peters, Gerhard; Woolley, John T. (2018a). "Republican Party Platform of 1868". The American Presidency Project. University of California, Santa Barbara.
^Peters, Gerhard; Woolley, John T. (2018b). "Democratic Party Platform of 1868". The American Presidency Project. University of California, Santa Barbara.
Carpenter, Daniel P. (2001). "Chapter Three". The Forging of Bureaucratic Autonomy: Reputations, Networks, and Policy Innovation in Executive Agencies, 1862–1928. Princeton University Press. pp. 84–85. ISBN 978-0-691-07009-4. OCLC 47120319. Retrieved April 1, 2010.
Coffey, David (2011). Spencer C. Tucker (ed.). The Encyclopedia of North American Indian Wars, 1607–1890: A Political, Social, and Military History. Vol. 1. ABC-CLIO. ISBN 978-1-85109-697-8.
Rafuse, Ethan S. (July 2007). "Still a Mystery? General Grant and the Historians, 1981–2006". Journal of Military History. 71 (3): 849–874. doi:10.1353/jmh.2007.0230. S2CID 159901226.
Russell, Henry M. W. (Spring 1990). "The Memoirs of Ulysses S. Grant: The Rhetoric of Judgment". Virginia Quarterly Review. 66 (2): 189–209. ISSN 0042-675X.
Simon, John Y. (1965). "From Galena to Appomattox: Grant and Washburne". Journal of the Illinois State Historical Society. 58 (2): 165–189. JSTOR i40006018.
Reeves, John (2023). Soldier of Destiny: Slavery, Secession, and the Redemption of Ulysses S. Grant. Simon & Schuster. ISBN 978-1-63936-528-9. Focus on 1860–1861.
Simpson, Brooks D. (1991). Let Us Have Peace: Ulysses S. Grant and the Politics of War and Reconstruction, 1861–1868. The University of North Carolina Press. ISBN 978-0807819661.
Miley Ray Cyrus (/ˈmaɪli ˈsaɪrəs/ MY-lee SY-rəs; born Destiny Hope Cyrus; November 23, 1992) is an American singer, songwriter, and actress. Regarded as a pop icon, Cyrus has been recognized for her evolving artistry and image reinventions. The daughter of singer Billy Ray Cyrus, she is considered one of the few child stars to achieve a successful entertainment career as an adult. Cyrus emerged as a teen idol as the title character in the Disney Channel television series Hannah Montana (2006–2011), which evolved into a commercially successful franchise. As Hannah Montana, she achieved success on the Billboard charts with two number-one soundtracks.
Destiny Hope Cyrus was born November 23, 1992, in Franklin, Tennessee,[2] to Leticia "Tish" Jean Cyrus (née Finley) and country singer Billy Ray Cyrus.[2] She was born with supraventricular tachycardia, a condition causing an abnormal resting heart rate.[3] Her birth name, Destiny Hope, expressed her parents' belief that she would accomplish great things. Her parents nicknamed her "Smiley", which they later shortened to "Miley", because she often smiled as an infant.[4] In 2008, she legally changed her name to Miley Ray Cyrus; her middle name honors her grandfather, Democratic politician Ronald Ray Cyrus, who was from Kentucky.[5] Cyrus's godmother is singer-songwriter Dolly Parton.[6]
Against the advice of her father's record company,[7] Cyrus's parents secretly married on December 28, 1993, a year after her birth.[8] They had two more children, son Braison and daughter Noah.[9] From a previous relationship, her mother has two other children, Brandi and Trace.[10] Her father's first child, Christopher Cody, was born in April 1992[8] and grew up separately with his mother, waitress Kristin Luckey, in South Carolina.[7][11]
All of Cyrus's maternal siblings are established entertainers. Trace is a vocalist and guitarist for the electronic pop band Metro Station.[12] Noah is an actress and, along with Braison, models, sings, and writes songs.[13][14][15][16][17] Brandi, formerly a musician for the indie rock band Frank + Derol,[18][19] is a professional DJ. The Cyrus farmhouse is located on 500 acres of land outside Nashville.[20]
Cyrus attended Heritage Elementary School in Williamson County while she and her family lived in Thompson's Station, Tennessee.[21] When she was cast in Hannah Montana, the family moved to Los Angeles and she attended Options for Youth Charter Schools[22] studying with a private tutor on set.[23] Raised as a Christian, she was baptized in a Southern Baptist church before moving to Hollywood in 2005.[24] She attended church regularly while growing up and wore a purity ring.[25] In 2001, when Cyrus was eight, she and her family moved to Toronto, Canada, while her father filmed the television series Doc.[26] After Billy Ray Cyrus took her to see a 2001 Mirvish production of Mamma Mia! at the Royal Alexandra Theatre, Miley Cyrus grabbed his arm and told him, "This is what I want to do, daddy. I want to be an actress."[27] She began to take singing and acting lessons at the Armstrong Acting Studio in Toronto.[28]
Cyrus's first acting role was as Kylie in her father's television series Doc.[4] In 2003, she received credit under her birth name for her role as "Young Ruthie" in Tim Burton's Big Fish.[29] During this period she auditioned with Taylor Lautner for the feature film The Adventures of Sharkboy and Lavagirl in 3-D. Although she was one of two finalists for the role, she chose to appear in Hannah Montana instead.[30] Her mother took on the role of Miley's manager and worked to acquire a team to build her daughter's career.[31][32] Cyrus signed with Mitchell Gossett, director of the youth division at Cunningham Escott Slevin Doherty.[33] Gossett is often credited with "discovering" Cyrus and played a key role in her auditioning for Hannah Montana.[34] She later signed with Jason Morey of Morey Management Group to handle her music career; Dolly Parton steered her to him.[32] She hired her father's finance manager as part of her team.[32]
Cyrus auditioned for the Disney Channel television series Hannah Montana when she was thirteen years old.[35] She tried out for the role of the title character's best friend but, after producers saw her comical performance, was asked to audition for the lead instead.[35] Though initially denied the part because she was "too small and too young" for the role,[36] she was later cast as the lead because of her singing and goofy acting abilities.[37] The series premiered in March 2006 to the largest audience for a Disney Channel program[38] and quickly ranked among the highest-rated series on basic cable.[39] The success of the series led to Cyrus being labeled a "teen idol".[29][40] She toured with the Cheetah Girls as Hannah Montana in September 2006 and performed songs from the show's first season.[41] Walt Disney Records released a soundtrack credited to Cyrus's character in October of that year.[42] The record was both a critical and commercial success, topping the Billboard 200 chart in the United States; it went on to sell over three million copies worldwide.[43] With the release of the soundtrack, Cyrus became the first act within the Walt Disney Company to have deals in television, film, consumer products, and music.[40]
Cyrus signed a four-album deal with Hollywood Records to distribute her non-Hannah Montana soundtrack music.[44] She released the two-disc album Hannah Montana 2: Meet Miley Cyrus in June 2007.[45] The first disc was credited as the second soundtrack by "Hannah Montana", while the second disc served as Cyrus's debut studio album.[45] The album became her second to reach the top of the Billboard 200 and has sold over three million copies.[46] Months after the release of the project, "See You Again" (2007) was released as the album's lead single.[47] The song was a commercial success and has sold over two million copies in the United States since its release.[48] She collaborated with her father on the single "Ready, Set, Don't Go" (2007).[49] Cyrus next embarked on the highly successful Best of Both Worlds Tour (2007–08) to promote the album.[50][51] Ticketmaster officials commented that "there [hadn't] been a demand of this level or intensity since The Beatles or Elvis."[52] The tour's success led to the theatrical release of the 3D concert film Hannah Montana & Miley Cyrus: Best of Both Worlds Concert (2008).[53] While initially intended as a limited release, the film's success led to a longer run.[54]
Cyrus and friend Mandy Jiroux began posting videos on YouTube in February 2008, referring to the clips as "The Miley and Mandy Show"; the videos garnered a large online following.[55] In April 2008, several pictures of Cyrus in her underwear and swimsuit were leaked online by a teenager who hacked her Gmail account.[56][57] Further controversy erupted when it was reported that the then-15-year-old Cyrus had posed topless during a photo shoot by Annie Leibovitz for Vanity Fair.[58] The New York Times subsequently clarified that although the shot left the impression that Cyrus was bare-breasted, she was wrapped in a bed sheet and was not topless.[59]
Cyrus went on to release her second studio album, Breakout (2008), in June of that year.[60] The album earned the highest first-week sales of her career thus far and became her third to top the Billboard 200.[61][62] Cyrus later starred with John Travolta in the animated film Bolt (2008), her debut as a film actress; she also co-wrote the song "I Thought I Lost You" (2008) for the film, which she sings as a duet with Travolta.[63] The film was both a critical and commercial success and earned her a Golden Globe Award nomination for Best Original Song.[64]
In March 2009, Cyrus released "The Climb" as a single from the soundtrack to the Hannah Montana feature film.[65] It was met with a warm critical and commercial reaction, becoming a crossover hit in both pop and country music formats.[66] The soundtrack became Cyrus's fourth entry to top the Billboard 200; at age 16, she became the youngest artist in history to have four number-one albums on the chart.[67] She released her fourth soundtrack as Hannah Montana in July 2009, which debuted at number two on the Billboard 200.[68] Cyrus later launched her first fashion line, Miley Cyrus and Max Azria, through Walmart.[69] It was promoted by the release of "Party in the U.S.A." (2009) and the EP The Time of Our Lives (2009).[70][71] Cyrus said the record was "a transitioning album [...] really to introduce people to what I want my next record to sound like and with time I will be able to do that a little more."[71] "Party in the U.S.A." became one of Cyrus's most successful singles to date and is considered one of her signature songs.[72] She embarked on her first world tour, the Wonder World Tour (2009), which was a critical and commercial success.[73] On December 7, 2009, Cyrus performed for Queen Elizabeth II and other members of the British royal family at the Royal Variety Performance in Blackpool, Lancashire.[74] Billboard ranked her as the fourth best-selling female music artist of 2009.[75]
2010–2012: New image with Can't Be Tamed and focus on acting
Hoping to foster a more mature image, Cyrus starred in the film The Last Song (2010), based on the Nicholas Sparks novel.[76] It was met with negative reviews[77] but was a box office hit.[78][79] Cyrus further attempted to shift her image with the release of her third studio album, Can't Be Tamed (2010).[80] The album featured a more dance-oriented sound than her prior releases and stirred considerable controversy over its lyrical content and Cyrus's live performances.[81][82][83][84] It sold 106,000 copies in its first week of release and became her first studio album not to top the Billboard 200 chart in the United States.[85] Cyrus released her final soundtrack as Hannah Montana that October; it was seen as a commercial failure because of its low chart position compared to her previous albums.[86]
Cyrus was the subject of further controversy when a video posted online in December 2010 showed her, then aged eighteen, smoking salvia with a bong.[87][88][89] 2010 ended with her ranking at number thirteen on the Forbes Celebrity 100 list.[90] She embarked on her worldwide Gypsy Heart Tour in April 2011 which had no North American dates;[91] she cited her various controversial moments as the reason, claiming that she only wanted to travel where she felt "the most love".[92][93] Following the release of Can't Be Tamed, Cyrus officially parted ways with Hollywood Records.[94] With her obligations to Hannah Montana fulfilled, Cyrus announced her plans to take a hiatus from music so she could focus on her acting career.[95] She confirmed she would not be going to college.[96][97]
Cyrus hosted the March 5, 2011, episode of Saturday Night Live, where she poked fun at her recent controversies.[98][99] That November, it was announced that Cyrus would voice Mavis in the animated film Hotel Transylvania;[100] however, by February 2012 she had been dropped from the project and replaced with Selena Gomez. At the time, Cyrus said she left the film to work on her music;[101] it was later reported that the real reason behind her exit was that she had bought her then-boyfriend Liam Hemsworth a penis-shaped birthday cake and licked it.[102] She made an appearance on the MTV television series Punk'd with Kelly Osbourne and Khloé Kardashian.[103][104] Cyrus starred alongside Demi Moore in the independent film LOL (2012).[105] The film had a limited release and was a critical and commercial failure.[106][107][108] She also starred in the comedy film So Undercover, playing an undercover FBI agent at a college sorority.[109]
Cyrus released a string of live performances known as the Backyard Sessions on YouTube during the spring and summer of 2012; the performances were of classic songs she personally liked.[110] Having begun work the previous year on a fourth album that was ultimately scrapped, Cyrus resumed working on a new musical project in late 2012.[111] She collaborated with producers Rock Mafia on their song "Morning Sun" (2012), which was made available for free download online.[112] She had previously appeared in the music video for their debut single, "The Big Bang" (2010).[113] Cyrus later provided guest vocals on "Decisions" (2012) by Borgore.[114] Both Cyrus and Hemsworth appeared in the song's music video.[115] She went on to guest star as Missi in two episodes of the CBS sitcom Two and a Half Men.[116] Cyrus drew significant media attention when she cut her traditionally long, brown hair in favor of a blonde pixie cut; she commented that she had "never felt more [herself] in [her] whole life" and that "it really changed [her] life."[117][118]
2013–2015: Bangerz and Miley Cyrus & Her Dead Petz
In 2013, Cyrus hired Larry Rudolph, best known for representing Britney Spears, as her manager.[119][120] It was confirmed that Cyrus had signed with RCA Records for her future releases.[121] She worked with producers such as Pharrell Williams and Mike Will Made It on her fourth studio album, resulting in a hip hop-influenced sound.[122] She collaborated with numerous hip hop artists on their releases[122] and appeared on the Snoop Lion song "Ashtrays and Heartbreaks" (2013), released as the lead single from his twelfth studio album, Reincarnated.[123] She also collaborated with will.i.am on the song "Fall Down" (2013), released as a promotional single that same month.[124] The song entered the Billboard Hot 100 at number fifty-eight, marking her first appearance on the chart since "Can't Be Tamed" (2010).[125] She provided guest vocals on the Lil Twist song "Twerk", which also featured Justin Bieber.[126] The song went unreleased for unknown reasons but leaked online.[126] On May 23, 2013, it was confirmed that Cyrus would be featured on the Mike Will Made It single "23", with Wiz Khalifa and Juicy J.[127] The single peaked at number eleven on the Hot 100 and had sold over one million copies worldwide as of 2013.[128]
Cyrus released her new single "We Can't Stop" on June 3.[129] Touted as her comeback single, it became a worldwide commercial success, topping charts in territories such as the United Kingdom.[130][131] The song's music video set the Vevo record for most views within twenty-four hours of release and became the first to reach 100 million views on the site.[132] Cyrus performed with Robin Thicke at the 2013 MTV Video Music Awards, a performance that resulted in widespread media attention and public scrutiny. Her simulated sex acts with a foam finger were described as "disturbing" and the whole performance as "cringe-worthy".[133][134] Cyrus released "Wrecking Ball" (2013) as the second single from Bangerz on the same day as the VMAs.[135] The accompanying music video, which showed her swinging naked on a wrecking ball, was viewed over nineteen million times within 24 hours of its release and drew criticism from some for allegedly objectifying Cyrus, including fellow singer Sinéad O'Connor, who said that "you will obscure your talent by allowing yourself to be pimped, whether it's the music business or yourself doing the pimping".[135][134][136] Despite this, the single became Cyrus's first to top the Hot 100 in the US and maintained the number-one spot for three weeks.[137] It sold over two million copies.[138]
On October 2, 2013, MTV aired the documentary Miley: The Movement, which chronicled the recording of her fourth studio album, Bangerz,[139][140] released on October 4.[141] The album was a commercial success, debuting at number one on the Billboard 200 with first-week sales of 270,000 copies.[142] On October 5, Cyrus hosted Saturday Night Live for the second time.[143] On November 5, Cyrus was featured on rapper Future's "Real and True" with Mr. Hudson; an accompanying music video premiered five days later, on November 10, 2013.[144] In late 2013, she was named Artist of the Year by MTV.[145]
On January 29, 2014, she played an acoustic concert on MTV Unplugged, performing songs from Bangerz with a guest appearance by Madonna.[146] It became the highest-rated MTV Unplugged of the past decade, with over 1.7 million streams.[147] Cyrus was also featured in the Marc Jacobs Spring 2014 campaign along with Natalie Westling and Esmerelda Seay Reynolds.[148] She launched her controversial Bangerz Tour (2014) that year, which was positively received by critics.[149][150] Two months into the tour, Cyrus's Alaskan Klee Kai was found mauled to death at her home after fighting with a coyote. Two weeks later, Cyrus suffered an allergic reaction to the antibiotic cephalexin, prescribed to treat a sinus infection,[151] resulting in her hospitalization in Kansas City. Though she rescheduled some of her US tour dates, she resumed the tour two weeks later, beginning with the European leg.[152]
Reports began to surface in 2015 that Cyrus was working on two albums simultaneously, one of which she hoped to release at no charge.[159] This was confirmed by her manager who claimed she was willing to end her contract with RCA Records if they refused to let her release a free album.[159] Cyrus was the host of the 2015 MTV Video Music Awards, making her its first openly pansexual host, and gave a surprise performance of a new song "Dooo It!" (2015) during the show's finale.[160][161] Immediately following the performance, Cyrus announced that her fifth studio album, Miley Cyrus & Her Dead Petz (2015), was available for free streaming on SoundCloud.[161] The album was written and produced primarily by Cyrus, and has been called experimental and psychedelic,[162][163][164] with elements of psychedelic pop,[165][166]psychedelic rock,[167] and alternative pop.[168]
In 2016, following the release of her fifth studio album the previous year, Cyrus resumed working on her sixth studio effort.[169][170] She was a key advisor during the tenth season of the reality singing competition The Voice.[171] In March, Cyrus signed on as a coach for the eleventh season of The Voice as a replacement for Gwen Stefani, becoming the youngest coach to appear in any incarnation of the series.[172] In September 2016, Cyrus co-starred in Crisis in Six Scenes, a television series Woody Allen created for Amazon Studios. She played a radical activist who causes chaos in a conservative 1960s household while hiding from the police.[173][174] On September 17, 2016, she appeared on The Tonight Show Starring Jimmy Fallon and covered Bob Dylan's "Baby, I'm in the Mood for You".[175] Cyrus also had an uncredited voice cameo as Mainframe in the superhero film Guardians of the Galaxy Vol. 2, released in May 2017.
On May 11, 2017, Cyrus released "Malibu" as the lead single from her sixth album.[176] The single debuted at No. 64 on the Billboard Hot 100 and peaked at No. 10 in its second week.[177] On June 9, Cyrus released "Inspired" after performing the song at the One Love Manchester benefit concert.[178] It served as a promotional single from the album. On August 8, Cyrus announced that her sixth studio album would be titled Younger Now and would be released on September 29, 2017.[179][180] The album's title track was released as its second single on August 18 and debuted and peaked at No. 79 on the Billboard Hot 100.[181] On August 27, Cyrus performed the track at the 2017 MTV Video Music Awards.[182] On September 15, she performed "Malibu", "Younger Now", "See You Again", "Party in the U.S.A." and a cover of the Roberta Flack hit "The First Time Ever I Saw Your Face" (written by Ewan MacColl) for the BBC Radio 1 Live Lounge.[183] On October 2, as part of a week of musical appearances on The Tonight Show Starring Jimmy Fallon, Cyrus sang her 2009 hit single "The Climb" for the first time since 2011, alongside a cover of Dido's "No Freedom", to honor the victims of the Las Vegas shooting.[184] She has since performed "The Climb" at multiple charity events, protests, and marches, including the March for Our Lives demonstrations in Washington, D.C.[185] That same year, Cyrus returned as a coach for the thirteenth season of The Voice after a one-season hiatus.[186][187] On October 5, 2017, Cyrus confirmed that she would not return to The Voice for season fourteen.[188] On October 30, 2017, she revealed that she would neither release further singles from Younger Now nor tour in support of it.[189]
Before the release of Younger Now in September 2017, Cyrus said she was "already two songs deep on the next [album]."[190] Producers attached to her seventh studio album included previous collaborator Mike Will Made It and new collaborators Mark Ronson and Andrew Wyatt.[191] Her first collaboration with Ronson, "Nothing Breaks Like a Heart" from his 2019 album Late Night Feelings, was released on November 29, 2018, to strong commercial reception, especially in Europe, where it peaked at number two on the UK Singles Chart and in Ireland, and topped the charts in several Eastern European countries, including Hungary and Croatia.[192][193][194]
On May 31, 2019, Cyrus tweeted that her seventh studio album would be titled She Is Miley Cyrus and would comprise three six-song EPs, which would be released before the full-length album: She Is Coming on May 31, She Is Here in the summer, and She Is Everything in the fall.[200] She Is Coming, which included vocal collaborations with RuPaul, Swae Lee, Mike Will Made It and Ghostface Killah, debuted at number five on the US Billboard 200 with 36,000 album-equivalent units,[201] while the lead single "Mother's Daughter" entered at number 54 on the US Billboard Hot 100.[202] The Wuki remix of "Mother's Daughter" received a nomination for Best Remixed Recording at the 62nd Annual Grammy Awards, while the original music video won two MTV Video Music Awards.[203][204] Cyrus promoted the EP with a summer European tour that visited major festivals such as Glastonbury and Primavera Sound.[205]
Cyrus starred in "Rachel, Jack and Ashley Too", an episode of the Netflix sci-fi series Black Mirror, which was filmed in South Africa in November 2018. It was released on Netflix on June 5, 2019.[206] In the episode, she played fictional pop star Ashley O and voiced her AI doll extension, Ashley Too. The plot was compared to Britney Spears's conservatorship and the Free Britney movement, which Cyrus has been an advocate for.[207] The music video for the song "On a Roll" from the episode was released on June 13;[208] the song itself and the B-side "Right Where I Belong" were released to digital platforms the next day.[209]
On June 27, it was revealed that Cyrus had collaborated with Ariana Grande and Lana Del Rey on "Don't Call Me Angel", the lead single of the soundtrack to the 2019 film Charlie's Angels.[210] It was released on September 13, 2019.[211] In August 2019, Cyrus released "Slide Away", her first song since announcing her separation from then-husband Hemsworth. The song hinted at their breakup and contained lyrics such as "Move on, we're not 17, I'm not who I used to be".[212] A music video was released in September 2019 that contained further references, including a ten of hearts playing card at the bottom of a pool to represent the end of her decade-long relationship with Hemsworth.[213]
On August 14, 2020, Cyrus released "Midnight Sky", the lead single from her seventh studio album, and confirmed the cancellation of the EPs She Is Here and She Is Everything because of major recent changes in her life that no longer fit the essence of the project, including her divorce from Hemsworth and the burning of the couple's house in the Woolsey Fire in California.[214][215] "Midnight Sky" became her highest-charting solo single since "Malibu" in 2017, peaking at number 14 on the US Billboard Hot 100 and at number five on the UK Singles Chart. The track was later mashed up with Stevie Nicks' "Edge of Seventeen".[216]
In October, Cyrus held a third Backyard Session on MTV and announced via Instagram that her seventh studio album, Plastic Hearts, would be released on November 27, 2020.[217][218] It had previously been intended to be titled She Is Miley Cyrus, completing the EP series once finalized.[219] The album was released to positive reviews and performed well, debuting at number two on the Billboard 200 with 60,000 units and becoming her twelfth top-ten entry on the chart. With that entry, Cyrus broke the record for the most US Billboard 200 top-five albums in the 21st century by a female music artist. Plastic Hearts marked Cyrus's move into rock and glam rock and spawned two further singles: "Prisoner", featuring English singer Dua Lipa, and "Angels like You", which peaked at numbers 8 and 66, respectively, in the United Kingdom.[220] The album also included vocal collaborations with Billy Idol and Joan Jett. Owing to popular demand and social media virality, Cyrus included her live covers of Blondie's "Heart of Glass" and The Cranberries' "Zombie".[221]
Cyrus won a 2020 Webby Special Achievement Award.[222] In February 2021, Cyrus performed at the first TikTok Tailgate show in Tampa for 7,500 vaccinated healthcare workers; the pre-show for Super Bowl LV aired on TikTok and CBS.[223] The performance was featured in the music video for "Angels like You".[224] In March 2021, Cyrus departed RCA and signed with Columbia Records, a sister label of RCA under the Sony Music umbrella.[225][226] That same month, Cyrus embraced her days as Hannah Montana and wrote an open letter to the character on social media for the show's 15th anniversary, despite her earlier statements that playing Montana had given her an identity crisis.[227][228] Rumors of a possible revival of the show have circulated ever since.[229] On April 3, 2021, Cyrus performed at the NCAA March Madness Final Four in Indianapolis with frontline health care workers in the audience.[231] On April 23, 2021, The Kid Laroi released a remix of his single "Without You" featuring Cyrus, her first release under Columbia Records.[230]
In May 2021, Cyrus signed an overall deal with NBCUniversal, including a first-look deal with her studio Hopetown Entertainment, under which she will develop projects for the company's outlets and star in three specials; the first project from the deal, the Stand By You Pride concert special, was released the following month on Peacock.[232][233] In June, Cyrus released a studio cover of Metallica's "Nothing Else Matters", included in The Metallica Blacklist, a tribute album to the band's self-titled record featuring renditions by various artists and released in conjunction with the original album's 30th anniversary.[234] The track also features Elton John on piano, Yo-Yo Ma, and Red Hot Chili Peppers drummer Chad Smith.[235] Cyrus initially teased a Metallica cover album in October 2020 and had already performed the track live during her set at Glastonbury.[236]
In February 2022, Cyrus embarked on the Attention Tour, a music festival tour in support of Plastic Hearts that took place in North, South, and Central America; it marked her first tour of South America since the Gypsy Heart Tour in 2011 and concluded on March 26, 2022.[241][242][243] On April 1, 2022, Cyrus released her third live album, Attention: Miley Live.[244] Most of the album was recorded during her concert at the Super Bowl Music Fest at the Crypto.com Arena in Los Angeles on February 12, 2022, with a set list drawing on her albums Plastic Hearts, Miley Cyrus & Her Dead Petz, Bangerz, The Time of Our Lives, Breakout, and Meet Miley Cyrus, along with multiple covers. The album also includes two previously unreleased tracks, "Attention" and "You". She said the album was "curated by the fans for the fans".[245] Emily Swingle of Clash praised Cyrus's versatile vocals, writing that her "voice is truly a force to be reckoned with, seamlessly fitting whatever genre she chooses to tackle. From the playful, country-hip-hop banger that is '4x4', to rap-heavy '23', to the bluesy, rich cover of Janis Joplin's 'Maybe', it seems like Cyrus can fit into just about any genre she gets her paws on."[246] At the end of that month, Cyrus released a deluxe version of the album with six additional songs, mostly recorded at the Lollapalooza festival in Brazil and other Latin American shows, including a mashup of "Mother's Daughter" and "Boys Don't Cry" featuring Anitta. She added her single "Angels Like You" to the set at her concert in Colombia in gratitude for the song reaching number one on iTunes there and for the fans who sang it all night outside her hotel in Bogotá.[247][248] The following month, NBC announced that Miley's New Year's Eve Party had been renewed for a second edition, set to air on New Year's Eve 2022–23.[249][250] In August 2022, it was announced that Cyrus would star in the Christmas television film Dolly Parton's Mountain Magic Christmas, produced by Dolly Parton for NBC.[251]
2023–present: Endless Summer Vacation and Something Beautiful
In late 2022, Cyrus and her longtime collaborator Mike Will Made It teased new music to be released in 2023.[252] Days later, during the second edition of Miley's New Year's Eve Party, the singer's next lead single, "Flowers", was announced.[253][254] It was released on January 13, 2023,[255][256] and debuted at number one on the Billboard Hot 100, Global 200, and Global Excl. US charts.[257] With thirteen weeks each atop the Global 200 and Global Excl. US charts, it became the longest-running leader on the former chart, at the time.[258] Topping the Hot 100 for eight non-consecutive weeks, it was Cyrus's second US number-one single—her first in a decade, since "Wrecking Ball" (2013)—and her longest-running chart-topper.[259][260] "Flowers" became a global success,[261] topping the charts in 37 countries,[262] including Australia, Canada, France, Germany, and the UK.[263] It was 2023's most-streamed and most-downloaded song on various platforms in numerous countries,[264] and the most-consumed song on US radio.[265] On Spotify, the single became the fastest track to surpass 100 million and 1 billion plays (7 and 112 days, respectively), at the time.[266][267] With 57 weeks atop the Billboard Adult Contemporary chart, it became the longest-running number-one song on any Billboard airplay chart in history.[268] It also earned the most cumulative weeks atop all Billboard airplay charts (106 weeks) of all time.[269] "Flowers" topped the year-end charts in various regions,[270] and ranked as the second best-performing song on the year-end Hot 100 chart of 2023.[271] According to the International Federation of the Phonographic Industry (IFPI), it was the best-selling song globally in 2023.[272] A demo version of the track was made available on March 3, 2023.[273] "Flowers" was certified seven-times platinum by the RIAA in March 2025.[274]
Cyrus's eighth studio album, Endless Summer Vacation, was released on March 10, 2023. She produced it with Kid Harpoon, Greg Kurstin, Mike Will Made It, and Tyler Johnson.[275] The album, her first studio effort with Columbia Records, was described by Cyrus as "[her] love letter to LA", reflecting the physical and mental growth she experienced during production.[276] On the album, a pop and dance-pop record, Cyrus focused primarily on songcraft before turning to production.[277] It debuted at number three on the US Billboard 200 with first-week sales of 119,000 album-equivalent units, marking Cyrus's tenth top-five and fourteenth top-ten entry on the chart.[278] "River", the second single off the record, was released on March 13, 2023,[279][280] and reached number 32 on the US Hot 100.[281] "Jaded", which peaked at number 56, became the third single in April 2023.[282][283] Endless Summer Vacation was the 19th best-selling album globally in 2023, according to the IFPI.[284]
A documentary concert special—part of Cyrus's Backyard Sessions series—titled Endless Summer Vacation (Backyard Sessions) premiered on Disney+ on March 10, accompanying the album's release.[285] Executive produced by Cyrus, it features her performing songs from the album and her 2009 single "The Climb", with an appearance by Rufus Wainwright.[286][287] In June, she voiced Van, a nihilistic female creature, in the second season of the Netflix adult animated sitcom Human Resources.[288] An updated version of the Disney+ special, titled Endless Summer Vacation: Continued (Backyard Sessions), premiered on ABC on August 24, 2023. The day after, Cyrus released the single "Used to Be Young", which was included in the digital reissue of Endless Summer Vacation[289] and debuted at number eight in the US.[290] On October 20, Dolly Parton released a rock re-recording of Cyrus's "Wrecking Ball" featuring her as a guest vocalist, as the final single off her studio album Rockstar (2023).[291] Billboard ranked Cyrus as the ninth-best-selling musician of 2023.[292]
Cyrus possesses a mezzo-soprano vocal range,[328] although her vocals were once described as alto,[329] with a "Nashville twang" in both her spoken and singing voice. Her voice has a distinctive raspy sound, similar in vein to those of Pink and Amy Winehouse.[330][331][99] On "Party in the U.S.A." (2009), her vocals feature belter refrains,[332] while those on "Obsessed" (2009) are described as "husky".[333] Releases such as "The Climb" (2009) and "These Four Walls" (2008) feature elements of country music and showcase Cyrus's "twangy vocals".[334] Cyrus experimented with an electropop sound on "Fly on the Wall" (2008), a genre she explored further on Can't Be Tamed (2010), her third studio album.[335] The album was initially intended to feature rock elements,[336] and Cyrus claimed after its release that it could be her final pop album.[337] Its songs speak of Cyrus's desire to achieve freedom in both her personal and professional life.[337] She began working on Bangerz (2013) during a musical hiatus and described the record as having a "dirty south feel" prior to its release.[338] Critics noted the use of hip hop and synthpop on the album.[339] Its songs are placed in chronological order, telling the story of her failed relationship with Liam Hemsworth.[340] Cyrus described Miley Cyrus & Her Dead Petz (2015) as "a little psychedelic, but still in that pop world".[156] For her rock-influenced album Plastic Hearts, Cyrus cited Britney Spears and Metallica as major influences.[341] Inspired by pop and dance-pop, Endless Summer Vacation (2023) "feels like a recap of her career's 15-plus years, with Cyrus breezing through genres with the ease of a well-seasoned tourist."[342] Cyrus related its overall concept to her affection for Los Angeles.
Cyrus's controversial musical performances have received significant media attention, including on her Bangerz Tour (2014) and Milky Milky Milk Tour (2015).[343] Her performance of "Party in the U.S.A." at the 2009 Teen Choice Awards sparked a "national uproar" because of her outfit and perceived pole dancing.[344][345] She faced similar controversy over her performance of "Can't Be Tamed" (2010) on Britain's Got Talent, where the singer pretended to kiss one of her female backup dancers onstage;[346] she defended the performance, arguing that she did nothing wrong.[346]
Cyrus became the subject of media and public scrutiny following her performance of "We Can't Stop" (2013) and "Blurred Lines" (2013) with Robin Thicke at the 2013 MTV Video Music Awards. Clad in a flesh-colored latex two-piece, she touched Thicke's crotch area with a giant foam finger and twerked against his crotch.[347] The performance resulted in a media frenzy; one reviewer likened it to a "bad acid trip",[133] while another described it as a "trainwreck in the classic sense of the word as the audience reaction seemed to be a mix of confusion, dismay and horror in a cocktail of embarrassment".[348] Cyrus entered the stage of her Bangerz Tour by sliding down a tongue-shaped slide and drew media attention during the tour for her unique outfits and racy performances.[349]
In the early years of her career, Cyrus had a generally wholesome image as a teen idol.[350] Her fame increased dramatically in the wake of the Vanity Fair photo scandal in 2008, and it was reported that photographs of Cyrus could be sold to photo agencies for up to $2,000 per photo.[350] In subsequent years, her image continued to shift dramatically from her teen idol status.[350] In 2008, Donny Osmond wrote of Cyrus's imminent transition to adulthood: "Miley will have to face adulthood... As she does, she'll want to change her image, and that change will be met with adversity."[351] The release of her 2010 album Can't Be Tamed saw Cyrus officially attempting to distance herself from her teenage persona by releasing controversial music videos for her songs "Can't Be Tamed" and "Who Owns My Heart".[352][353] Her behavior throughout 2013 and 2014 sparked a substantial amount of controversy, although her godmother Dolly Parton said "...the girl can write. The girl can sing. The girl is smart. And she doesn't have to be so drastic. But I will respect her choices. I did it my way, so why can't she do it her way?"[354] Liel Leibovitz at Tablet noted in 2013: "Talking to the website Hunger, the singer argued that those adults who deem her gyrations too sultry and her music too saccharine were simply too ancient—and Jewish—to get it. 'With magazines, with movies, it's always weird when things are targeted for young people yet they're driven by people that are like 40 years too old', Cyrus opined. And one group stands out in [her] mind as deserving of most of the blame: 'It can't be like this 70-year-old Jewish man that doesn’t leave his desk all day, telling me what the clubs want to hear.'"[355]
Cyrus was ranked number 17 on Forbes's list of the most powerful celebrities in 2014; the magazine noted that "The last time she made our list was when she was still rolling in Hannah Montana money. Now the pop singer is all grown up and courting controversy at every turn."[356] In August 2014, her life was documented in a comic book titled Fame: Miley Cyrus; it begins with her controversial 2013 MTV Video Music Awards performance and covers her Disney fame as well as her childhood in Tennessee.[357] The comic book was written by Michael L. Frizell, drawn by Juan Luis Rincón, and is available in both print and digital formats.[358] In September 2010, Cyrus placed tenth on the first-ever edition of Billboard's 21 Under 21 list;[359] she was ranked twenty-first in 2011[360] and eighteenth in 2012.[361] In 2013, Maxim listed Cyrus at number one on its annual Hot 100 list.[362] Cyrus was chosen by Time magazine as one of the finalists for Person of the Year in November 2013;[363] she came in third place with 16.3% of the staff vote.[364] In March 2014, Skidmore College in New York began offering a special topics sociology course entitled "The Sociology of Miley Cyrus: Race, Class, Gender and Media", which used "Miley as a lens through which to explore sociological thinking about identity, entertainment, media and fame".[365] In 2015, Cyrus was listed as one of the nine runners-up for The Advocate's Person of the Year.[366] In March 2024, to commemorate the 65th anniversary of International Women's Day, Cyrus was one of a number of celebrities who had their likenesses turned into Bratz dolls.[367]
Cyrus resides in Hidden Hills, California, and also owns a $5.8 million home in her hometown of Franklin.[368][369] She was raised as a Christian and identified herself as such during her childhood and early adult life,[24] but she included references to Tibetan Buddhism in her 2015 song "Milky Milky Milk"[370] and is also influenced by Hindu beliefs.[371]
Cyrus came out to her mother at age 14[374][375][376] and has said: "I never want to label myself! I am ready to love anyone that loves me for who I am! I am open."[377] In June 2015, Time magazine reported that she identified as gender fluid.[378][375][376][377] She said she "doesn't relate to being boy or girl, and I don't have to have my partner relate to boy or girl",[374] adding that she is "literally open to every single thing that is consenting and doesn't involve an animal and everyone is of age".[374]
Cyrus is a supporter of the LGBT community.[379] Her 2010 song "My Heart Beats for Love" was written for one of her gay friends,[380] and she has since said London is her favorite place to perform due to its extensive gay scene.[381] Cyrus has an equals sign tattooed on her ring finger in support of same-sex marriage.[382] After her 2018 marriage to a man, Cyrus went on record to say she still identified as queer.[383] In 2014, she founded the Happy Hippie Foundation, which works to "fight injustice facing homeless youth, LGBTQ youth and other vulnerable populations".[384]
Cyrus became a vegan in 2014, giving up all animal products.[385] In 2020, she said on The Joe Rogan Experience that she had switched to a pescatarian diet after suffering from an omega-3 deficiency: "I've been a vegan for a very long time and I had to introduce fish and omegas into my life because my brain wasn't functioning properly."[385] Cyrus said she cried when eating her first fish after her vegan diet, saying "I cried for the fish ... it really hurts me to eat fish."[386] Her decision to quit veganism sparked backlash from the vegan and vegetarian community, with critics accusing Cyrus of "spreading misinformation about omega-3" and "abandoning her vegan diet".[387]
Cyrus has been open about her recreational use of cannabis.[388][389] She told Rolling Stone in 2013 that it was "the best drug on earth" and called it, along with MDMA, a "happy drug".[390] While accepting the Best Video Award at the 2013 MTV Europe Music Awards, Cyrus appeared to smoke a joint onstage; this was removed from the delayed broadcast of the show in the United States.[391] In a 2014 interview with W magazine, Cyrus said "I love weed" and "I just love getting stoned."[392] In a 2017 interview on The Tonight Show Starring Jimmy Fallon, she said she had quit cannabis before the press tour for Younger Now so she could be "super clear" when discussing the record.[393] In May 2018, she told Jimmy Kimmel: "I also think it's the most magical, amazing... it's my first and true love. It's just not for me right now at this time in my life, but I'm sure there will be a day I will happily indulge."[394] During a December 2018 interview with Andy Cohen, she credited her mother for reintroducing her to cannabis.[395] In 2019, Cyrus sent "Nothing Breaks Like a Heart" collaborator Mark Ronson a cannabis bouquet from Lowell Herb Co as a tongue-in-cheek Valentine's Day gift.[396] She invested in the company in August.[397]
Before and shortly after vocal cord surgery in November 2019, Cyrus said she had abstained from cannabis and alcohol.[398][399][400][401][402]
Cyrus has said that she dated singer-actor Nick Jonas from June 2006 to December 2007,[403] claiming they were "in love" and began dating soon after they first met.[404] Their relationship attracted considerable media attention.[405] Cyrus was in a nine-month relationship with model Justin Gaston from 2008 to 2009.[406] In 2009, while filming The Last Song, she began an on-again, off-again relationship with her co-star Liam Hemsworth.[407] During the breakups, Cyrus was romantically linked to actors Lucas Till (2009) and Josh Bowman (2011).[408] Cyrus and Hemsworth were first engaged from May 2012 to September 2013.[409][410] She has also dated actor Patrick Schwarzenegger (2014–2015) and model Stella Maxwell (2015).[411][407][412]
Cyrus and Hemsworth rekindled their relationship in March 2016,[413][414] and got engaged again that October.[415] In November 2018, their home burned down in the Woolsey Fire in California.[416][417] On December 23, Cyrus and Hemsworth married in a private ceremony at their home in Nashville.[418] She said her marriage redefined "what it looks like for someone that's a queer person like me to be in a hetero relationship" while "still very sexually attracted to women". Cyrus said the ceremony was "kind of out of character for me" because they had "worn rings forever [and] definitely didn't need it in any way". She believed the loss of their home to be the catalyst for the wedding, saying "the timing felt right" and that "no one is promised the next day, or the next, so I try to be 'in the now' as much as possible".[419] On August 10, 2019, Cyrus announced their separation;[420] on August 21, Hemsworth filed for divorce, citing "irreconcilable differences".[421] Their divorce was finalized on January 28, 2020.[422]
After announcing her separation from Hemsworth, Cyrus dated Kaitlynn Carter from August to September 2019.[423][424][425] In October 2019, Cyrus began dating Australian singer Cody Simpson, a longtime friend.[426] In August 2020, Cyrus announced that she and Simpson had split up.[427] Her announcement coincided with the release of her single "Midnight Sky", which was inspired by her breakups with Hemsworth, Carter, and Simpson.[428][429][430] In 2021, Cyrus began dating American musician Maxx Morando, who worked as a producer on her 2023 album Endless Summer Vacation.[431][432]
In January 2011, Cyrus met an ailing fan with spina bifida through the charity Kids Wish Network.[460] In April 2011, she appeared in a commercial for the American Red Cross asking people to pledge $10 to help those affected by the 2011 Tōhoku earthquake and tsunami.[461] That same year, Hilary Duff presented Cyrus with the first-ever Global Action Youth Leadership Award at the first annual Global Action Awards Gala for her support of Blessings in a Backpack, an organization that works to feed hungry children in schools, and for her Get Ur Good On campaign with Youth Service America. Cyrus stated: "I want (kids) to do something they love. Not something that seems like a chore because someone tells them that's the right thing to do or what their parents want or what's important to people around them, but what's in their heart."[462][463] In December 2011, she appeared in a commercial for the charity J/P Haitian Relief Organization and teamed up with her elder brother Trace Cyrus to design a limited-edition T-shirt and hoodie for charity. All proceeds from the sale of these items went to her charity, Get Ur Good On, which supports education for underprivileged children.[464][465] That month, she performed "The Climb" at CNN Heroes: An All-Star Tribute at the Shrine Auditorium in Los Angeles.[466]
At the 2014 MTV Video Music Awards, Cyrus won Video of the Year for "Wrecking Ball". Instead of accepting the award herself, she invited a 22-year-old homeless man named Jesse to collect it on her behalf; she had met him at My Friend's Place, an organization that helps homeless youth find shelter, work, health care, and education. His acceptance speech urged musicians to learn more about youth homelessness in Los Angeles through Cyrus's Facebook page.[477] Cyrus then launched a Prizeo campaign to raise funds for the charity; those who donated were entered into a sweepstake for a chance to meet Cyrus on her Bangerz Tour in Rio de Janeiro that September.[478] In early 2015, Cyrus teamed up with MAC Cosmetics to launch her own branded Viva Glam lipstick, with proceeds going to the MAC AIDS Fund.[479]
In June 2017, Cyrus performed at One Love Manchester, a televised benefit concert organized by Ariana Grande following the Manchester Arena bombing at her concert two weeks earlier.[480] During an appearance on The Ellen DeGeneres Show in August 2017, Cyrus said that she would donate $500,000 to Hurricane Harvey relief efforts.[481] In August 2019, she performed at the Sunny Hill Festival in Kosovo, created by Dua Lipa and her father to raise funds for people there facing financial hardship.[482] In September 2019, Cyrus met another fan through the Make-A-Wish Foundation at the 2019 iHeartRadio Music Festival in Las Vegas, Nevada.[483][459] Cyrus and her boyfriend Cody Simpson donated 120 tacos to healthcare workers amid the COVID-19 pandemic in April 2020.[484] That same month, she again partnered with MAC Cosmetics's annual Viva Glam campaign, donating $10 million toward 250 local organizations nationwide that were heavily impacted by the pandemic.
Cyrus is the founder of the Happy Hippie Foundation, which works to "fight injustice facing homeless youth, LGBTQ youth, and other vulnerable populations".[384] From 2014 to 2016 the foundation served nearly 1,500 homeless youth in Los Angeles, reached more than 25,000 LGBTQ youth and their families with resources about gender, and provided social services to transgender individuals, youth in conflict zones, and people affected by crises.[486] Happy Hippie encourages Cyrus's fans to support causes including gender equality, LGBTQ rights and mental health through awareness campaigns and fundraising. Leading up to the 2020 presidential election, Happy Hippie encouraged its Instagram followers to seek out VoteRiders for assistance ensuring that gender identity would not affect their right to vote.
On June 15, 2015, Cyrus launched the #InstaPride campaign[487] in collaboration with Instagram. The campaign featured a series of portraits of transgender and gender-expansive people, posted to her Instagram feed with the hashtags "#HappyHippiePresents" and "#InstaPride". It aimed to encourage diversity and tolerance by showing its subjects in a positive light, as examples for others who might be struggling to figure themselves out and as a reference point for people who did not personally know anyone in that situation. Cyrus was behind the camera for the entire photoshoot and interviewed her 14 subjects to share their personal stories. She said she wanted to bring attention to and celebrate people who would not normally find themselves as the stars of a photoshoot or on the cover of a magazine.[488]
After Cyrus and Hemsworth lost their Malibu home to the Woolsey Fire, the couple and their community launched the Malibu Foundation for relief efforts following the 2018 California wildfires,[489] with Cyrus's Happy Hippie Foundation donating $500,000 to the Malibu Foundation.[490][491]
In 2024, Cyrus announced that the foundation would be renamed the Miley Cyrus Foundation.[492]
Cyrus has been compared to Madonna, the "Queen of Pop", and is considered by many to be the "Madonna of her generation".[493]
Cyrus's early success as a teen idol and the face of Disney Channel's billion-dollar franchise Hannah Montana[494] played an important role in shaping 2000s teen pop culture, earning her the honorific nickname "Teen Queen".[495][496][497] Bickford stated that Hannah Montana adopted a business model of combining celebrity acts with film, television, and popular music for a pre-adolescent audience. He called the series "genre-defining"[498] and likened this model to 1990s teen pop artists such as Britney Spears and NSYNC, who were also marketed to children.[498] Morgan Genevieve Blue of Feminist Media Studies stated that the series' primary female characters, Miley and her alter ego Hannah, are positioned as post-feminist subjects in a way that confines their representation to notions of femininity and consumerism.[499] The Times journalist Craig McLean named Cyrus the "world's biggest-ever teenage star".[500]
During the Best of Both Worlds Tour, tickets sold out in minutes and stadiums were filled to capacity, making it the highest-grossing concert tour for a new act in 2007 and 2008.[501] According to Billboard boxscore,[502] the tour drew a total attendance of approximately one million people[503] and grossed over US$54 million, earning Cyrus the award for Breakthrough Act at the 2008 Billboard Touring Awards.[501] In 2012, Rolling Stone ranked Cyrus's rise as one of the top 25 teen idol breakout moments of the rock era, with Andy Greene writing: "Miley's rise was meteoric. Tickets to her 2007 Best of Both Worlds tour sold out faster than any tour in memory ... It seemed like she was poised to become a more stable version of Britney Spears – especially after singles 'The Climb' and 'Party In The USA'".[504] Citing her popularity, Paul McCartney compared the success of stars like Cyrus and Justin Bieber to that of the Beatles in a 2011 interview during his tour, commenting: "I think when they have new sensations, like Miley Cyrus or Justin Bieber, teenagers identify with them, in the same way that the boys identified with The Beatles, [...] when you have thousands of teenagers feeling the same, they become elated because they have this love for something in common, whether it is The Beatles, Miley Cyrus, Justin Bieber, or whatever."[505][506]
Over the years, Cyrus's song "Party in the U.S.A." has gained popularity in American culture around holidays and historic events, re-entering the charts every Independence Day since its release. Following the death of Osama bin Laden on May 2, 2011, the song's music video saw a resurgence in popularity: the official YouTube video was flooded with comments about bin Laden's death, and the song was immediately deemed a celebratory anthem for the event.[507] In 2013, an online petition on the White House's "We the People" petitions website urged then-president Barack Obama to change the U.S. national anthem from "The Star-Spangled Banner" to "Party in the U.S.A."[508][509] On November 7, 2020, after major news outlets announced Democratic nominee Joe Biden as the winner of the presidential election, supporters in New York City began singing "Party in the U.S.A." in Times Square.[510]
Cyrus's album Bangerz (2013), along with its promotional events, is considered one of the most controversial moments in 2010s popular culture and established Cyrus among the decade's most controversial figures.[511] Glamour writer Mickey Woods likened the album's promotional "era" to those of Britney Spears's and Christina Aguilera's third and fourth studio albums, Britney (2001) and Stripped (2002), respectively, adding that Cyrus's record "will probably be retrospectively deemed iconic, maybe even classic".[512] Billboard listed Bangerz as one of the greatest and most influential albums of the 2010s, noting that "with this pivotal album release, Cyrus took control of her public persona, surprising less with her provocative antics than with her constant artistic evolution".[513] The album was ranked number 230 on Rolling Stone's "250 Greatest Albums of the 21st Century".[514] According to Lyndsey Havens, Bangerz was a trendsetter in "weaving together urban and pop influences", and "what's most revered now is what it represented then".[515] Patrick Ryan of USA Today commented that Cyrus's collaborations with Mike Will Made It on the album contributed to his new-found prominence, stating that his position as an executive producer helped him "[jump] to the forefront as an interesting character [...] in an era where a lot of producers have fallen behind the scenes again".[516] Vice described Cyrus as "the most punk rock musician out there", writing that she was "spinning circles around every single pop star who [was] trying to be edgy" at the time.[517] MTV named Cyrus its Best Artist of 2013, and James Montgomery of MTV News elaborated on the network's decision that Cyrus "[declared] her independence and [dominated] the pop-culture landscape", adding that "she schooled—and shocked—us all in 2013, and did so on her own terms."[518] Billboard staff called Cyrus the "Most Talked About Pop Star" of 2013 and recognized the controversial evolution of her career as the "Top Music Moment" of the year, describing her as a "maelstrom that expanded and grazed nearly every aspect of pop culture in 2013".[519] The publication also ranked "We Can't Stop" as the best song of 2013 for being "one of the bolder musical choices in recent memory",[520] and as one of the songs that defined the decade.[515] The song's music video and Cyrus's controversial 2013 VMAs performance with Robin Thicke were declared, respectively, the 27th greatest music video and one of the most "defining" pop culture moments of the 2010s.[521][522]
In 2015, Rebecca Nicholson of The Guardian published an article calling Cyrus the Madonna of her generation, writing that "she's a Disney survivor with a fluid approach to gender identity. And, like the old three-chord punks, she gives really good quote". According to Nicholson, Cyrus takes "the 90s Madonna approach to public sexuality: it's deliberately provocative, and crucially, it is not being served up for male consumption." She likewise defended Cyrus's controversial rebellion, highlighting that behind the persona is a human, talented, and strong person who manages to connect with the public, just like the "Queen of Pop".[493] In November of the same year, Vulture ranked Cyrus number one on its "Disney and Nickelodeon Stars Gone Pop" listicle, writing that "no post-millennium child star [had] grown up as wildly, rapidly, or successfully as [Cyrus]" at the time.[523] Placing her at number eight in the 2021 revision of the ranking, the publication named her one of the few child stars with a successful music career as an adult, calling her "the archetype for Disney 2.0 stars" who "picked up the child-star trap of getting pigeonholed and set it on fire".[524] Billboard listed the singer among its "Greatest of All Time Billboard 200 Artists", placing her thirty-first overall and ninth among female artists.[525] In 2017, the magazine also published an article naming her a "Queer Superhero" for her philanthropic fight for the LGBTQ+ community.[526] In 2019, Billboard ranked her 62nd on its "Greatest of All Time Artists" chart,[527] and 55th on the 2010s decade-end chart of Top Artists, signifying the most successful acts of the decade.[528][529]
Due to her continual artistic reinventions, sonic and stylistic evolution, and versatility, Cyrus has been nicknamed the "Pop Chameleon" by various media publications.[530][524] She has also been considered a pop icon by several publications,[531][532] with the BBC calling her "the ultimate 21st century pop star".[533] In 2023, The Hollywood Reporter named Cyrus one of its "Platinum Players" in music.[534] Billboard included Cyrus in its "Greatest Pop Stars of 2023" listicle, naming her the "Comeback Artist of the Year". The magazine called her 2023 single "Flowers" the "biggest chart smash of her career" and noted that it "re-established [Cyrus] as one of pop's foremost hitmakers".[532] In 2024, at age 31, Cyrus became the youngest recipient of the Disney Legends award, for her outstanding contributions to the Walt Disney Company.[535] That year, she was ranked number 15 on Billboard's "Greatest Pop Stars of the 21st Century"; the magazine wrote that Cyrus has "endured as one of the century's most significant pop stars—because no matter what style she's trying out, at the end of the day, she's always still just being Miley".[536] In 2025, Billboard ranked her twenty-first on its "Top Artists of the 21st Century" list,[537] and ninth on its "Top 100 Women Artists of the 21st Century" list.[538] Artists who have cited Cyrus or her work as an inspiration or influence include Chappell Roan,[539] JoJo Siwa,[540] Lea Michele,[541] Lil Nas X,[542] and Troye Sivan.[543]
In late August 1961, a few weeks after he was born, Obama and his mother moved to Seattle, where she enrolled at the University of Washington and they lived for a year. During that time, Obama's father completed his undergraduate degree in economics in Hawaii, graduating in June 1962. He left to attend graduate school on a scholarship at Harvard University, where he earned a Master of Arts in economics. Obama's parents divorced in March 1964.[22] Obama Sr. returned to Kenya in 1964, where he married for a third time and worked for the Kenyan government as the senior economic analyst in the Ministry of Finance.[23][page needed] He visited his son in Hawaii only once, at Christmas 1971,[24] before he was killed in an automobile accident in 1982, when Obama was 21 years old.[25] Recalling his early childhood, Obama said: "That my father looked nothing like the people around me—that he was black as pitch, my mother white as milk—barely registered in my mind."[19] He described his struggles as a young adult to reconcile social perceptions of his multiracial heritage.[26]
Obama's Indonesian school record at St. Francis of Assisi Catholic Elementary School. Obama was enrolled as "Barry Soetoro" (no. 1), and was wrongly recorded as an Indonesian citizen (no. 3) and a Muslim (no. 4).[29]
When Obama was six years old, he and his mother moved to Indonesia to join his stepfather. From age six to ten, he was registered in school as "Barry"[29] and attended local Indonesian-language schools: Sekolah Dasar Katolik Santo Fransiskus Asisi (St. Francis of Assisi Catholic Elementary School) for two years and Sekolah Dasar Negeri Menteng 01 (State Elementary School Menteng 01) for one and a half years, supplemented by English-language Calvert School homeschooling by his mother.[30][31] As a result of his four years in Jakarta, he was able to speak Indonesian fluently as a child.[32] During his time in Indonesia, Obama's stepfather taught him to be resilient and gave him "a pretty hardheaded assessment of how the world works".[33]
In 1971, Obama returned to Honolulu to live with his maternal grandparents, Madelyn and Stanley Dunham. He attended Punahou School—a private college preparatory school—with the aid of a scholarship from fifth grade until he graduated from high school in 1979.[34] In high school, Obama continued to use the nickname "Barry", which he kept until a visit to Kenya in 1980.[35] Obama lived with his mother and half-sister, Maya Soetoro, in Hawaii for three years from 1972 to 1975 while his mother was a graduate student in anthropology at the University of Hawaii.[36] Obama chose to stay in Hawaii when his mother and half-sister returned to Indonesia in 1975, so his mother could begin anthropology field work.[37] His mother spent most of the next two decades in Indonesia, divorcing Lolo Soetoro in 1980 and earning a PhD degree in 1992, before dying in 1995 in Hawaii following unsuccessful treatment for ovarian and uterine cancer.[38]
Of his years in Honolulu, Obama wrote: "The opportunity that Hawaii offered — to experience a variety of cultures in a climate of mutual respect — became an integral part of my world view, and a basis for the values that I hold most dear."[39] Obama has also written and talked about using alcohol, marijuana, and cocaine during his teenage years to "push questions of who I was out of my mind".[40] He was also a member of the self-named "Choom Gang" ("choom" being slang for smoking marijuana), a group of friends who spent time together and smoked marijuana.[41][42]
Two years after graduating from Columbia, Obama moved from New York to Chicago when he was hired as director of the Developing Communities Project, a faith-based community organization originally comprising eight Catholic parishes in Roseland, West Pullman, and Riverdale on Chicago's South Side. He worked there as a community organizer from June 1985 to May 1988.[50][52] He helped set up a job training program, a college preparatory tutoring program, and a tenants' rights organization in Altgeld Gardens.[53] Obama also worked as a consultant and instructor for the Gamaliel Foundation, a community organizing institute.[54] In mid-1988, he traveled to Europe for the first time, spending three weeks there, and then spent five weeks in Kenya, where he met many of his paternal relatives for the first time.[55][56]
In 1991, Obama accepted a two-year position as Visiting Law and Government Fellow at the University of Chicago Law School to work on his first book.[63][65] He then taught constitutional law at the University of Chicago Law School for twelve years, first as a lecturer from 1992 to 1996, and then as a senior lecturer from 1996 to 2004.[66]
From April to October 1992, Obama directed Illinois's Project Vote, a voter registration campaign with ten staffers and seven hundred volunteer registrars; it achieved its goal of registering 150,000 of 400,000 unregistered African Americans in the state, leading Crain's Chicago Business to name Obama to its 1993 list of "40 under Forty" powers to be.[67]
In a 2006 interview, Obama highlighted the diversity of his extended family: "It's like a little mini-United Nations," he said. "I've got relatives who look like Bernie Mac, and I've got relatives who look like Margaret Thatcher."[68] Obama has a half-sister with whom he was raised (Maya Soetoro-Ng) and seven other half-siblings from his Kenyan father's family, six of them living.[69] Obama's Kansas-born maternal grandmother, Madelyn Dunham,[70] outlived his mother, dying on November 2, 2008,[71] two days before his election to the presidency. Obama also has roots in Ireland; he met with his Irish cousins in Moneygall in May 2011.[72] In Dreams from My Father, Obama ties his mother's family history to possible Native American ancestors and distant relatives of Jefferson Davis, President of the Confederate States of America during the American Civil War. He also shares distant ancestors in common with George W. Bush and Dick Cheney, among others.[73][74][75]
Obama lived with anthropologist Sheila Miyoshi Jager while he was a community organizer in Chicago in the 1980s.[76] He proposed to her twice, but both Jager and her parents turned him down.[76][77] The relationship was not made public until May 2017, several months after his presidency had ended.[77]
Obama poses in the Green Room of the White House with wife Michelle and daughters Sasha and Malia, September 2009
In June 1989, Obama met Michelle Robinson when he was employed at Sidley Austin.[78] Robinson was assigned for three months as Obama's adviser at the firm, and she joined him at several group social functions but declined his initial requests to date.[79] They began dating later that summer, became engaged in 1991, and were married on October 3, 1992.[80] After suffering a miscarriage, Michelle underwent in vitro fertilization to conceive their children.[81] The couple's first daughter, Malia Ann, was born in 1998,[82] followed by a second daughter, Natasha ("Sasha"), in 2001.[83] The Obama daughters attended the University of Chicago Laboratory Schools. When they moved to Washington, D.C., in January 2009, the girls started at the Sidwell Friends School.[84] The Obamas had two Portuguese Water Dogs; the first, a male named Bo, was a gift from Senator Ted Kennedy.[85] In 2013, Bo was joined by Sunny, a female.[86] Bo died of cancer on May 8, 2021.[87]
Obama is a supporter of the Chicago White Sox, and he threw out the first pitch at the 2005 ALCS when he was still a senator.[88] In 2009, he threw out the ceremonial first pitch at the All-Star Game while wearing a White Sox jacket.[89] In the NFL, he primarily supports the Chicago Bears, though in his childhood and adolescence he was a fan of the Pittsburgh Steelers and rooted for them ahead of their victory in Super Bowl XLIII, 12 days after he took office as president.[90] In 2011, Obama invited the 1985 Chicago Bears to the White House; the team had not visited the White House after their Super Bowl win in 1986 due to the Space Shuttle Challenger disaster.[91] He plays basketball, a sport he participated in as a member of his high school's varsity team,[92] and he is left-handed.[93]
In 2005, the Obama family applied the proceeds of a book deal and moved from a Hyde Park, Chicago condominium to a $1.6 million house (equivalent to $2.6 million in 2024) in neighboring Kenwood, Chicago.[94] The purchase of an adjacent lot—and sale of part of it to Obama by the wife of developer, campaign donor, and friend Tony Rezko—attracted media attention because of Rezko's subsequent indictment and conviction on political corruption charges that were unrelated to Obama.[95]
In December 2007, Money magazine estimated Obama's net worth at $1.3 million (equivalent to $2 million in 2024).[96] The Obamas' 2009 tax return showed a household income of $5.5 million—up from about $4.2 million in 2007 and $1.6 million in 2005—mostly from sales of his books.[97][98] On his 2010 income of $1.7 million, he gave 14 percent to non-profit organizations, including $131,000 to Fisher House Foundation, a charity assisting wounded veterans' families, allowing them to reside near where the veteran is receiving medical treatment.[99][100] Per his 2012 financial disclosure, Obama may be worth as much as $10 million.[101]
Obama is a Protestant Christian whose religious views developed in his adult life.[102] He wrote in The Audacity of Hope that he "was not raised in a religious household." He described his mother, raised by non-religious parents, as being detached from religion, yet "in many ways the most spiritually awakened person... I have ever known", and "a lonely witness for secular humanism." He described his father as a "confirmed atheist" by the time his parents met, and his stepfather as "a man who saw religion as not particularly useful." Obama explained how, through working with black churches as a community organizer while in his twenties, he came to understand "the power of the African-American religious tradition to spur social change."[103]
In January 2008, Obama told Christianity Today: "I am a Christian, and I am a devout Christian. I believe in the redemptive death and resurrection of Jesus Christ. I believe that faith gives me a path to be cleansed of sin and have eternal life."[104] On September 27, 2010, Obama released a statement commenting on his religious views, saying:
I'm a Christian by choice. My family didn't—frankly, they weren't folks who went to church every week. And my mother was one of the most spiritual people I knew, but she didn't raise me in the church. So I came to my Christian faith later in life, and it was because the precepts of Jesus Christ spoke to me in terms of the kind of life that I would want to lead—being my brothers' and sisters' keeper, treating others as they would treat me.[105][106]
In 2016, Obama said that he gets inspiration from a few items that remind him "of all the different people I've met along the way", adding: "I carry these around all the time. I'm not that superstitious, so it's not like I think I necessarily have to have them on me at all times." The items, "a whole bowl full", include rosary beads given to him by Pope Francis, a figurine of the Hindu deity Hanuman, a Coptic cross from Ethiopia, a small Buddha statue given by a monk, and a metal poker chip that used to be the lucky charm of a motorcyclist in Iowa.[112][113]
From 1994 to 2002, Obama served on the boards of directors of the Woods Fund of Chicago—which in 1985 had been the first foundation to fund the Developing Communities Project—and of the Joyce Foundation.[50] He served on the board of directors of the Chicago Annenberg Challenge from 1995 to 2002, as founding president and chairman of the board of directors from 1995 to 1999.[50] Obama's law license became inactive in 2007.[114][115]
State senator Obama and others celebrate the naming of a street in Chicago after ShoreBank co-founder Milton Davis in 1998
Obama was elected to the Illinois Senate in 1996, succeeding Democratic state senator Alice Palmer from Illinois's 13th District, which, at that time, spanned Chicago South Side neighborhoods from Hyde Park–Kenwood south to South Shore and west to Chicago Lawn.[116] Once elected, Obama gained bipartisan support for legislation that reformed ethics and health care laws.[117][118] He sponsored a law that increased tax credits for low-income workers, negotiated welfare reform, and promoted increased subsidies for childcare.[119] In 2001, as co-chairman of the bipartisan Joint Committee on Administrative Rules, Obama supported Republican governor George Ryan's payday loan regulations and predatory mortgage lending regulations aimed at averting home foreclosures.[120][121]
In January 2003, Obama became chairman of the Illinois Senate's Health and Human Services Committee when Democrats, after a decade in the minority, regained a majority.[125] He sponsored and led unanimous, bipartisan passage of legislation to monitor racial profiling by requiring police to record the race of drivers they detained, and legislation making Illinois the first state to mandate videotaping of homicide interrogations.[119][126][127][128] During his 2004 general election campaign for the U.S. Senate, police representatives credited Obama for his active engagement with police organizations in enacting death penalty reforms.[129] Obama resigned from the Illinois Senate in November 2004 following his election to the U.S. Senate.[130]
Obama campaign yard sign in Chicago, c. November 2004
In May 2002, Obama commissioned a poll to assess his prospects in a 2004 U.S. Senate race. He created a campaign committee, began raising funds, and lined up political media consultant David Axelrod by August 2002. Obama formally announced his candidacy in January 2003.[131]
Obama was an early opponent of the George W. Bush administration's 2003 invasion of Iraq.[132] On October 2, 2002, the day President Bush and Congress agreed on the joint resolution authorizing the Iraq War,[133] Obama addressed the first high-profile Chicago anti-Iraq War rally,[134] and spoke out against the war.[135] He addressed another anti-war rally in March 2003 and told the crowd "it's not too late" to stop the war.[136]
Decisions by Republican incumbent Peter Fitzgerald and his Democratic predecessor Carol Moseley Braun not to participate in the election resulted in wide-open Democratic and Republican primary contests involving 15 candidates.[137] In the March 2004 primary election, Obama won in an unexpected landslide—which overnight made him a rising star within the national Democratic Party, started speculation about a presidential future, and led to the reissue of his memoir, Dreams from My Father.[138] In July 2004, Obama delivered the keynote address at the 2004 Democratic National Convention,[139] seen by nine million viewers. His speech was well received and elevated his status within the Democratic Party.[140]
Obama's expected opponent in the general election, Republican primary winner Jack Ryan, withdrew from the race in June 2004.[141] Six weeks later, Alan Keyes accepted the Republican nomination to replace Ryan.[142] In the November 2004 general election, Obama won with 70 percent of the vote, the largest margin of victory for a Senate candidate in Illinois history.[143] He took 92 of the state's 102 counties, including several where Democrats traditionally do not do well.[citation needed]
In December 2006, President Bush signed into law the Democratic Republic of the Congo Relief, Security, and Democracy Promotion Act, marking the first federal legislation to be enacted with Obama as its primary sponsor.[150][151] In January 2007, Obama and Senator Feingold introduced a corporate jet provision to the Honest Leadership and Open Government Act, which was signed into law in September 2007.[152][153]
Later in 2007, Obama sponsored an amendment to the Defense Authorization Act to add safeguards for personality-disorder military discharges.[154] This amendment passed the full Senate in the spring of 2008.[155] He sponsored the Iran Sanctions Enabling Act supporting divestment of state pension funds from Iran's oil and gas industry, which was never enacted but later incorporated in the Comprehensive Iran Sanctions, Accountability, and Divestment Act of 2010;[156] and co-sponsored legislation to reduce risks of nuclear terrorism.[157] Obama also sponsored a Senate amendment to the State Children's Health Insurance Program, providing one year of job protection for family members caring for soldiers with combat-related injuries.[158]
Numerous candidates entered the 2008 Democratic Party presidential primaries. The field narrowed to Obama and Senator Hillary Clinton after the early contests, and the race remained close throughout the primary process, but Obama gained a steady lead in pledged delegates due to better long-range planning, superior fundraising, dominant organizing in caucus states, and better exploitation of delegate allocation rules.[168]
On June 3, 2008, Obama had received enough delegates to clinch the nomination. After initially hesitating to concede, Clinton ended her campaign and endorsed Obama on June 7.[169] On August 23, 2008, Obama announced his selection of Delaware senator Joe Biden as his vice presidential running mate.[170] Obama selected Biden from a field speculated to include former Indiana governor and senator Evan Bayh and Virginia governor Tim Kaine.[170] At the Democratic National Convention in Denver, Colorado, Hillary Clinton called for her supporters to endorse Obama, and she and Bill Clinton gave convention speeches in his support.[171][172] Obama delivered his acceptance speech at Invesco Field at Mile High stadium to a crowd of about eighty-four thousand; the speech was viewed by over three million people worldwide.[173][174][175] During both the primary process and the general election, Obama's campaign set numerous fundraising records, particularly in the quantity of small donations.[176] On June 19, 2008, Obama became the first major-party presidential candidate to turn down public financing in the general election since the system was created in 1976.[177]
John McCain was nominated as the Republican candidate, and he selected Sarah Palin as his running mate. Obama and McCain engaged in three presidential debates in September and October 2008.[178] On November 4, Obama won the presidency with 365 electoral votes to 173 received by McCain.[179] Obama won 52.9 percent of the popular vote to McCain's 45.7 percent.[180] He became the first African-American to be elected president.[181] Obama delivered his victory speech before hundreds of thousands of supporters in Chicago's Grant Park.[182][183] He is one of only three people to have moved directly from the U.S. Senate to the White House, the others being Warren G. Harding and John F. Kennedy.[184]
On April 4, 2011, Obama filed election papers with the Federal Election Commission and then announced his reelection campaign for 2012 in a video titled "It Begins with Us" that he posted on his website.[185][186][187] As the incumbent president, he ran virtually unopposed in the Democratic Party presidential primaries,[188] and on April 3, 2012, Obama secured the 2,778 convention delegates needed to win the Democratic nomination.[189] At the Democratic National Convention in Charlotte, North Carolina, Obama and Joe Biden were formally nominated by former president Bill Clinton as the Democratic Party's candidates for president and vice president in the general election. Their main opponents were Republicans Mitt Romney, the former governor of Massachusetts, and Representative Paul Ryan of Wisconsin.[190]
On November 6, 2012, Obama won 332 electoral votes, exceeding the 270 required for him to be reelected as president.[191][192][193] With 51.1 percent of the popular vote,[194] Obama became the first Democratic president since Franklin D. Roosevelt to win a majority of the popular vote twice.[195][196] Obama addressed supporters and volunteers at Chicago's McCormick Place after his reelection and said: "Tonight you voted for action, not politics as usual. You elected us to focus on your jobs, not ours. And in the coming weeks and months, I am looking forward to reaching out and working with leaders of both parties."[197][198]
The inauguration of Barack Obama as the 44th president took place on January 20, 2009. In his first few days in office, Obama issued executive orders and presidential memoranda directing the U.S. military to develop plans to withdraw troops from Iraq.[199] He ordered the closing of the Guantanamo Bay detention camp,[200] but Congress prevented the closure by refusing to appropriate the required funds[201][202] and preventing the transfer of any Guantanamo detainee.[203] Obama reduced the secrecy given to presidential records.[204] He also revoked President George W. Bush's restoration of President Ronald Reagan's Mexico City policy, which prohibited federal aid to international family planning organizations that perform or provide counseling about abortion.[205]
The first bill signed into law by Obama was the Lilly Ledbetter Fair Pay Act of 2009, relaxing the statute of limitations for equal-pay lawsuits.[206] Five days later, he signed the reauthorization of the State Children's Health Insurance Program to cover an additional four million uninsured children.[207] In March 2009, Obama reversed a Bush-era policy that had limited funding of embryonic stem cell research and pledged to develop "strict guidelines" on the research.[208]
Obama appointed two women to the Supreme Court in the first two years of his presidency. He nominated Sonia Sotomayor on May 26, 2009, to replace retiring associate justice David Souter. She was confirmed on August 6, 2009,[209] becoming the first Supreme Court justice of Hispanic descent.[210] Obama nominated Elena Kagan on May 10, 2010, to replace retiring associate justice John Paul Stevens. She was confirmed on August 5, 2010, bringing the number of women sitting simultaneously on the Court to three for the first time in American history.[211]
In July 2009, Obama launched the Priority Enforcement Program, an immigration enforcement program that had been pioneered by George W. Bush, and the Secure Communities fingerprinting and immigration status data-sharing program.[212]
On January 16, 2013, one month after the Sandy Hook Elementary School shooting, Obama signed 23 executive orders and outlined a series of sweeping proposals regarding gun control.[214] He urged Congress to reintroduce an expired ban on military-style assault weapons, such as those used in several recent mass shootings; limit ammunition magazines to 10 rounds; introduce background checks on all gun sales; ban the possession and sale of armor-piercing bullets; introduce harsher penalties for gun traffickers, especially unlicensed dealers who buy arms for criminals; and approve the appointment of a head of the federal Bureau of Alcohol, Tobacco, Firearms and Explosives for the first time since 2006.[215] On January 5, 2016, Obama announced new executive actions extending background check requirements to more gun sellers.[216] In a 2016 editorial in The New York Times, Obama compared the struggle for what he termed "common-sense gun reform" to women's suffrage and other civil rights movements in American history.
In 2011, Obama signed a four-year renewal of the Patriot Act.[217] Following the 2013 global surveillance disclosures by whistleblower Edward Snowden, Obama condemned the leak as unpatriotic,[218] but called for increased restrictions on the National Security Agency (NSA) to address violations of privacy.[219][220] Obama continued and expanded surveillance programs set up by George W. Bush, while implementing some reforms.[221] He supported legislation that would have limited the NSA's ability to collect phone records in bulk under a single program and supported bringing more transparency to the Foreign Intelligence Surveillance Court (FISC).[221]
In his speeches as president, Obama did not make more overt references to race relations than his predecessors,[222][223] but according to one study, he implemented stronger policy action on behalf of African-Americans than any president since the Nixon era.[224]
Following Obama's election, many pondered the existence of a "post-racial America".[225][226] However, lingering racial tensions quickly became apparent,[225][227] and many African-Americans expressed outrage over what they saw as an intense racial animosity directed at Obama.[228] The acquittal of George Zimmerman following the killing of Trayvon Martin sparked national outrage, leading to Obama giving a speech in which he said that "Trayvon Martin could have been me 35 years ago."[229] The shooting of Michael Brown in Ferguson, Missouri, sparked a wave of protests.[230] These and other events led to the birth of the Black Lives Matter movement, which campaigns against violence and systemic racism toward black people.[230] Though Obama entered office reluctant to talk about race, by 2014 he began openly discussing the disadvantages faced by many members of minority groups.[231]
Several incidents during Obama's presidency generated disapproval from the African-American community and from law enforcement, and Obama sought to build trust between law enforcement officials and civil rights activists, with mixed results. Some in law enforcement criticized Obama's condemnation of racial bias after incidents in which police action led to the death of African-American men, while some racial justice activists criticized Obama's expressions of empathy for the police.[232] In a March 2016 Gallup poll, nearly one third of Americans said they worried "a great deal" about race relations, a higher figure than in any previous Gallup poll since 2001.[233]
As a candidate for the Illinois state senate in 1996, Obama stated he favored legalizing same-sex marriage.[239] During his Senate run in 2004, he said he supported civil unions and domestic partnerships for same-sex partners but opposed same-sex marriages.[240] In 2008, he reaffirmed this position by stating "I believe marriage is between a man and a woman. I am not in favor of gay marriage."[241] On May 9, 2012, shortly after the official launch of his campaign for re-election as president, Obama said his views had evolved, and he publicly affirmed his personal support for the legalization of same-sex marriage, becoming the first sitting U.S. president to do so.[242][243] During his second inaugural address on January 21, 2013,[198] Obama became the first U.S. president in office to call for full equality for gay Americans, and the first to mention gay rights or the word "gay" in an inaugural address.[244][245] In 2013, the Obama administration filed briefs that urged the Supreme Court to rule in favor of same-sex couples in the cases of Hollingsworth v. Perry (regarding same-sex marriage)[246] and United States v. Windsor (regarding the Defense of Marriage Act).[247]
Obama intervened in the troubled automotive industry[251] in March 2009, renewing loans for General Motors (GM) and Chrysler to continue operations while reorganizing. Over the following months the White House set terms for both firms' bankruptcies, including the sale of Chrysler to Italian automaker Fiat[252] and a reorganization of GM giving the U.S. government a temporary 60 percent equity stake in the company.[253] In June 2009, dissatisfied with the pace of economic stimulus, Obama called on his cabinet to accelerate the investment.[254] He signed into law the Car Allowance Rebate System, known colloquially as "Cash for Clunkers", which temporarily boosted the economy.[255][256][257]
The Bush and Obama administrations authorized spending and loan guarantees from the Federal Reserve and the Department of the Treasury. These guarantees totaled about $11.5 trillion, but only $3 trillion had been spent by the end of November 2009.[258] On August 2, 2011, after a lengthy congressional debate over whether to raise the nation's debt limit, Obama signed the bipartisan Budget Control Act of 2011. The legislation enforced limits on discretionary spending until 2021, established a procedure to increase the debt limit, created a Congressional Joint Select Committee on Deficit Reduction to propose further deficit reduction with a stated goal of achieving at least $1.5 trillion in budgetary savings over 10 years, and established automatic procedures for reducing spending by as much as $1.2 trillion if legislation originating with the new joint select committee did not achieve such savings.[259] By passing the legislation, Congress was able to prevent a U.S. government default on its obligations.[260]
The unemployment rate rose in 2009, reaching a peak in October at 10.0 percent and averaging 10.0 percent in the fourth quarter. Following a decrease to 9.7 percent in the first quarter of 2010, the unemployment rate fell to 9.6 percent in the second quarter, where it remained for the rest of the year.[261] Between February and December 2010, employment rose by 0.8 percent, which was less than the average of 1.9 percent experienced during comparable periods in the past four employment recoveries.[262] By November 2012, the unemployment rate fell to 7.7 percent,[263] decreasing to 6.7 percent in the last month of 2013.[264] During 2014, the unemployment rate continued to decline, falling to 6.3 percent in the first quarter.[264] GDP growth returned in the third quarter of 2009, expanding at a rate of 1.6 percent, followed by a 5.0 percent increase in the fourth quarter.[265] Growth continued in 2010, posting an increase of 3.7 percent in the first quarter, with lesser gains throughout the rest of the year.[265] In July 2010, the Federal Reserve noted that economic activity continued to increase, but its pace had slowed, and chairman Ben Bernanke said the economic outlook was "unusually uncertain".[266] Overall, the economy expanded at a rate of 2.9 percent in 2010.[267]
Job growth during Obama's presidency compared with that of other presidents, measured as the cumulative percentage change from the month after inauguration to the end of his term
The Congressional Budget Office (CBO) and a broad range of economists credit Obama's stimulus plan for economic growth.[269][270] The CBO released a report stating that the stimulus bill increased employment by 1–2.1 million,[270][271][272] while conceding that "it is impossible to determine how many of the reported jobs would have existed in the absence of the stimulus package."[269] Although an April 2010 survey of members of the National Association for Business Economics showed an increase in job creation (over a similar January survey) for the first time in two years, 73 percent of the 68 respondents believed the stimulus bill had had no impact on employment.[273] Under Obama, the United States economy grew faster than those of the other original NATO members by a wider margin than at any time since the end of World War II.[274] The Organisation for Economic Co-operation and Development attributed the much faster growth in the United States to the U.S. stimulus plan, in contrast with the austerity measures adopted in the European Union.[275]
Within a month of the 2010 midterm elections, Obama announced a compromise deal with the Congressional Republican leadership that included a temporary, two-year extension of the 2001 and 2003 income tax rates, a one-year payroll tax reduction, continuation of unemployment benefits, and a new rate and exemption amount for estate taxes.[276] The compromise overcame opposition from some in both parties, and the resulting $858 billion (equivalent to $1.2 trillion in 2024) Tax Relief, Unemployment Insurance Reauthorization, and Job Creation Act of 2010 passed with bipartisan majorities in both houses of Congress before Obama signed it on December 17, 2010.[277]
On April 20, 2010, an explosion destroyed an offshore drilling rig at the Macondo Prospect in the Gulf of Mexico, causing a major sustained oil leak. Obama visited the Gulf, announced a federal investigation, and formed a bipartisan commission to recommend new safety standards, after a review by Secretary of the Interior Ken Salazar and concurrent Congressional hearings. He then announced a six-month moratorium on new deepwater drilling permits and leases, pending regulatory review.[280] As multiple efforts by BP failed, some in the media and public expressed confusion and criticism over various aspects of the incident, and stated a desire for more involvement by Obama and the federal government.[281] Prior to the oil spill, on March 31, 2010, Obama had ended a ban on oil and gas drilling along the majority of the East Coast of the United States and along the coast of northern Alaska in an effort to win support for an energy and climate bill and to reduce foreign imports of oil and gas.[282]
In July 2013, Obama expressed reservations and said he "would reject the Keystone XL pipeline if it increased carbon pollution [or] greenhouse emissions."[283][284] On February 24, 2015, Obama vetoed a bill that would have authorized the pipeline.[285] It was the third veto of Obama's presidency and his first major veto.[286]
In December 2016, Obama permanently banned new offshore oil and gas drilling in most United States-owned waters in the Atlantic and Arctic Oceans using the 1953 Outer Continental Shelf Act.[287][288][289]
Obama emphasized the conservation of federal lands during his term in office. He used his power under the Antiquities Act to create 25 new national monuments during his presidency and expand four others, protecting a total of 553,000,000 acres (224,000,000 ha) of federal lands and waters, more than any other U.S. president.[290][291][292]
Obama called for Congress to pass legislation reforming health care in the United States, a key campaign promise and a top legislative goal.[293] He proposed an expansion of health insurance coverage to cover the uninsured, cap premium increases, and allow people to retain their coverage when they leave or change jobs. His proposal was to spend $900 billion over ten years and include a government insurance plan, also known as the public option, to compete with the corporate insurance sector as a main component of lowering costs and improving the quality of health care. The plan would also make it illegal for insurers to drop sick people or deny them coverage for pre-existing conditions, require every American to carry health coverage, and include medical spending cuts and taxes on insurance companies that offer expensive plans.[294][295]
On July 14, 2009, House Democratic leaders introduced a 1,017-page plan for overhauling the U.S. health care system, which Obama wanted Congress to approve by the end of 2009.[293] After public debate during the Congressional summer recess of 2009, Obama delivered a speech to a joint session of Congress on September 9 in which he addressed concerns over the proposals.[297]
On November 7, 2009, a health care bill featuring the public option was passed in the House.[299][300] On December 24, 2009, the Senate passed its own bill—without a public option—on a party-line vote of 60–39.[301] On March 21, 2010, the House passed the Senate's December bill, the Patient Protection and Affordable Care Act (ACA, colloquially "Obamacare"), by a vote of 219 to 212. Obama signed the bill into law on March 23, 2010.[302]
The ACA includes health-related provisions, most of which took effect in 2014, including expanding Medicaid eligibility to people making up to 133 percent of the federal poverty level (FPL) starting in 2014,[303] subsidizing insurance premiums for people making up to 400 percent of the FPL ($88,000 for a family of four in 2010) so that their maximum "out-of-pocket" payment for annual premiums would be from 2 percent to 9.5 percent of income,[304] providing incentives for businesses to provide health care benefits, prohibiting denial of coverage and denial of claims based on pre-existing conditions, establishing health insurance exchanges, prohibiting annual coverage caps, and supporting medical research. According to White House and CBO figures, the maximum share of income that enrollees would have to pay would vary depending on their income relative to the federal poverty level.[305]
Percentage of Individuals in the United States without Health Insurance, 1963–2015 (source: JAMA)[306]
The costs of these provisions are offset by taxes, fees, and cost-saving measures, such as new Medicare taxes for those in high-income brackets, taxes on indoor tanning, cuts to the Medicare Advantage program in favor of traditional Medicare, and fees on medical devices and pharmaceutical companies;[307] there is also a tax penalty for those who do not obtain health insurance, unless they are exempt due to low income or other reasons.[308] In March 2010, the CBO estimated that the net effect of both laws would be a reduction in the federal deficit of $143 billion over the first decade.[309]
The law faced several legal challenges, primarily based on the argument that an individual mandate requiring Americans to buy health insurance was unconstitutional. On June 28, 2012, the Supreme Court ruled by a 5–4 vote in National Federation of Independent Business v. Sebelius that the mandate was constitutional under the U.S. Congress's taxing authority.[310] In Burwell v. Hobby Lobby the Court ruled that "closely-held" for-profit corporations could be exempt on religious grounds under the Religious Freedom Restoration Act from regulations adopted under the ACA that would have required them to pay for insurance that covered certain contraceptives. In June 2015, the Court ruled 6–3 in King v. Burwell that subsidies to help individuals and families purchase health insurance were authorized for those doing so on both the federal exchange and state exchanges, not only those purchasing plans "established by the State", as the statute reads.[311]
In February and March 2009, Vice President Joe Biden and Secretary of State Hillary Clinton made separate overseas trips to announce a "new era" in U.S. foreign relations with Russia and Europe, using the terms "break" and "reset" to signal major changes from the policies of the preceding administration.[312] Obama attempted to reach out to Arab leaders by granting his first interview to an Arab satellite TV network, Al Arabiya.[313] On March 19, Obama continued his outreach to the Muslim world, releasing a New Year's video message to the people and government of Iran.[314][315] On June 4, 2009, Obama delivered a speech at Cairo University in Egypt calling for "A New Beginning" in relations between the Islamic world and the United States and promoting Middle East peace.[316] On June 26, 2009, Obama condemned the Iranian government's actions towards protesters following Iran's 2009 presidential election.[317]
In 2011, Obama ordered a drone strike in Yemen that targeted and killed Anwar al-Awlaki, an American imam suspected of being a leading Al-Qaeda organizer, making al-Awlaki the first U.S. citizen to be targeted and killed by a U.S. drone strike. The Department of Justice released a memo justifying al-Awlaki's death as a lawful act of war,[318] while civil liberties advocates described it as a violation of his constitutional right to due process, and the killing led to significant controversy.[319] His teenage son and young daughter, also Americans, were later killed in separate U.S. military actions, although they were not targeted specifically.[320][318]
In March 2015, Obama declared that he had authorized U.S. forces to provide logistical and intelligence support to the Saudis in their military intervention in Yemen, establishing a "Joint Planning Cell" with Saudi Arabia.[321][322] In 2016, the Obama administration proposed a series of arms deals with Saudi Arabia worth $115 billion.[323] Obama halted the sale of guided munition technology to Saudi Arabia after Saudi warplanes targeted a funeral in Yemen's capital Sanaa, killing more than 140 people.[324]
On February 27, 2009, Obama announced that combat operations in Iraq would end within 18 months.[326] The Obama administration scheduled the withdrawal of combat troops to be completed by August 2010, decreasing troop levels from 142,000 while leaving a transitional force of about 50,000 in Iraq until the end of 2011. On August 19, 2010, the last U.S. combat brigade exited Iraq. Remaining troops transitioned from combat operations to counter-terrorism and the training, equipping, and advising of Iraqi security forces.[327][328] On August 31, 2010, Obama announced that the United States combat mission in Iraq was over.[329] On October 21, 2011, President Obama announced that all U.S. troops would leave Iraq in time to be "home for the holidays."[330]
In June 2014, following the capture of Mosul by ISIL, Obama sent 275 troops to provide support and security for U.S. personnel and the U.S. Embassy in Baghdad. ISIL continued to gain ground and to commit widespread massacres and ethnic cleansing.[331][332] In August 2014, during the Sinjar massacre, Obama ordered a campaign of U.S. airstrikes against ISIL.[333] By the end of 2014, 3,100 American ground troops were committed to the conflict[334] and 16,000 sorties were flown over the battlefield, primarily by U.S. Air Force and Navy pilots.[335] In early 2015, with the addition of the "Panther Brigade" of the 82nd Airborne Division, the number of U.S. ground troops in Iraq increased to 4,400,[336] and by July American-led coalition air forces had counted 44,000 sorties over the battlefield.[337]
Obama after a trilateral meeting with Afghan president Hamid Karzai (left) and Pakistani president Asif Ali Zardari (right), May 2009
In his election campaign, Obama called the war in Iraq a "dangerous distraction" and said that the emphasis should instead be on the war in Afghanistan,[338] the region he cited as the most likely launching point for another attack against the United States.[339] Early in his presidency, Obama moved to bolster U.S. troop strength in Afghanistan. He announced an increase in U.S. troop levels to 17,000 military personnel in February 2009 to "stabilize a deteriorating situation in Afghanistan", an area he said had not received the "strategic attention, direction and resources it urgently requires."[340] He replaced the military commander in Afghanistan, General David D. McKiernan, with former Special Forces commander Lt. Gen. Stanley A. McChrystal in May 2009, indicating that McChrystal's Special Forces experience would facilitate the use of counterinsurgency tactics in the war.[341] On December 1, 2009, Obama announced the deployment of an additional 30,000 military personnel to Afghanistan and proposed to begin troop withdrawals 18 months from that date;[342] this took place in July 2011. David Petraeus replaced McChrystal in June 2010, after McChrystal's staff criticized White House personnel in a magazine article.[343] In February 2013, Obama said the U.S. military would reduce the troop level in Afghanistan from 68,000 to 34,000 U.S. troops by February 2014.[344] In October 2015, the White House announced a plan to keep U.S. forces in Afghanistan indefinitely in light of the deteriorating security situation.[345]
Regarding neighboring Pakistan, Obama called its tribal border region the "greatest threat" to the security of Afghanistan and Americans, saying that he "cannot tolerate a terrorist sanctuary." In the same speech, Obama claimed that the U.S. "cannot succeed in Afghanistan or secure our homeland unless we change our Pakistan policy."[346]
Starting with information received from Central Intelligence Agency operatives in July 2010, the CIA developed intelligence over the next several months that determined what they believed to be the hideout of Osama bin Laden. He was living in seclusion in a large compound in Abbottabad, Pakistan, a suburban area 35 miles (56 km) from Islamabad.[347] CIA head Leon Panetta reported this intelligence to President Obama in March 2011.[347] Meeting with his national security advisers over the course of the next six weeks, Obama rejected a plan to bomb the compound, and authorized a "surgical raid" to be conducted by United States Navy SEALs.[347] The operation took place on May 1, 2011, and resulted in the shooting death of bin Laden and the seizure of papers, computer drives and disks from the compound.[348][349] DNA testing was one of five methods used to positively identify bin Laden's corpse,[350] which was buried at sea several hours later.[351] Within minutes of the president's announcement from Washington, D.C., late in the evening on May 1, there were spontaneous celebrations around the country as crowds gathered outside the White House, and at New York City's Ground Zero and Times Square.[348][352] Reaction to the announcement was positive across party lines, including from former presidents Bill Clinton and George W. Bush.[353]
Obama meeting with Cuban president Raúl Castro in Panama, April 2015
Beginning in the spring of 2013, secret meetings between the United States and Cuba were conducted in the neutral locations of Canada and Vatican City.[354] The Vatican first became involved in 2013 when Pope Francis advised the U.S. and Cuba to exchange prisoners as a gesture of goodwill.[355] On December 10, 2013, Cuban President Raúl Castro, in a significant public moment, greeted and shook hands with Obama at the Nelson Mandela memorial service in Johannesburg.[356]
In December 2014, after the secret meetings, it was announced that Obama, with Pope Francis as an intermediary, had negotiated a restoration of relations with Cuba after nearly sixty years of estrangement.[357] The rapprochement, popularly dubbed the Cuban Thaw, was deemed by The New Republic to be "Obama's finest foreign policy achievement."[358] On July 1, 2015, President Obama announced that formal diplomatic relations between Cuba and the United States would resume, and embassies would be opened in Washington and Havana.[359] The countries' respective "interests sections" in one another's capitals were upgraded to embassies on July 20 and August 13, 2015, respectively.[360] Obama visited Havana, Cuba, for two days in March 2016, becoming the first sitting U.S. president to visit since Calvin Coolidge in 1928.[361]
During the initial years of the Obama administration, the U.S. increased military cooperation with Israel, including increased military aid, re-establishment of the U.S.–Israeli Joint Political Military Group and the Defense Policy Advisory Group, and an increase in visits among high-level military officials of both countries.[362] The Obama administration asked Congress to allocate money toward funding the Iron Dome program in response to the waves of Palestinian rocket attacks on Israel.[363] In March 2010, Obama took a public stance against plans by the government of Israeli prime minister Benjamin Netanyahu to continue building Jewish housing projects in predominantly Arab neighborhoods of East Jerusalem.[364][365] In 2011, the United States vetoed a Security Council resolution condemning Israeli settlements; it was the only nation to do so.[366] Obama supports the two-state solution to the Arab–Israeli conflict based on the 1967 borders with land swaps.[367]
In 2013, Jeffrey Goldberg reported that, in Obama's view, "with each new settlement announcement, Netanyahu is moving his country down a path toward near-total isolation."[368] In 2014, Obama likened the Zionist movement to the civil rights movement in the United States. He said both movements seek to bring justice and equal rights to historically persecuted peoples, explaining: "To me, being pro-Israel and pro-Jewish is part and parcel with the values that I've been fighting for since I was politically conscious and started getting involved in politics."[369] Obama expressed support for Israel's right to defend itself during the 2014 Israel–Gaza conflict.[370] In 2015, Obama was harshly criticized by Israel for advocating and signing the Iran nuclear deal; Israeli prime minister Benjamin Netanyahu, who had urged the U.S. Congress to oppose it, said the deal was "dangerous" and "bad."[371]
On December 23, 2016, under the Obama administration, the United States abstained from United Nations Security Council Resolution 2334, which condemned Israeli settlement building in the occupied Palestinian territories as a violation of international law, effectively allowing it to pass.[372] Netanyahu strongly criticized the Obama administration's actions,[373][374] and on January 6, 2017, the Israeli government withdrew its annual dues from the organization, which totaled $6 million.[375] On January 5, 2017, the United States House of Representatives voted 342–80 to condemn the UN resolution.[376][377]
In February 2011, protests against long-time dictator Muammar Gaddafi began in Libya as part of the Arab Spring and soon turned violent. In March, as forces loyal to Gaddafi advanced on rebels across Libya, calls for a no-fly zone came from around the world, including from Europe and the Arab League, and the U.S. Senate unanimously passed a resolution[378] calling for one.[379] In response to the passage of United Nations Security Council Resolution 1973 on March 17, Libyan foreign minister Moussa Koussa announced a ceasefire; however, Gaddafi's forces continued to attack the rebels.[380]
On March 19, a multinational coalition led by France and the United Kingdom, with Italian and U.S. support and Obama's approval, took part in air strikes to destroy the Libyan government's air defense capabilities, protect civilians, and enforce a no-fly zone,[381] including the use of Tomahawk missiles, B-2 Spirits, and fighter jets.[382][383][384] Six days later, on March 25, by unanimous vote of all 28 of its members, NATO took over leadership of the effort, dubbed Operation Unified Protector.[385] Some members of Congress[386] questioned whether Obama had the constitutional authority to order military action, in addition to questioning its cost, structure, and aftermath.[387][388] In 2016, Obama said "Our coalition could have and should have done more to fill a vacuum left behind" and that it was "a mess".[389] He has stated that the lack of preparation for the days following the government's overthrow was the "worst mistake" of his presidency.[390]
On August 18, 2011, several months after the start of the Syrian civil war, Obama issued a written statement that said: "The time has come for President Assad to step aside."[391] This stance was reaffirmed in November 2015.[392] In 2012, Obama authorized multiple programs run by the CIA and the Pentagon to train anti-Assad rebels.[393] The Pentagon-run program was later found to have failed and was formally abandoned in October 2015.[394][395]
On October 1, 2009, the Obama administration went ahead with a Bush administration program to increase nuclear weapons production: the "Complex Modernization" initiative expanded two existing nuclear sites to produce new bomb parts. In November 2013, the Obama administration opened negotiations with Iran to prevent it from acquiring nuclear weapons, which included an interim agreement. Negotiations took two years, with numerous delays, before a deal was announced on July 14, 2015. The deal, titled the "Joint Comprehensive Plan of Action", lifted sanctions in exchange for measures that would prevent Iran from producing nuclear weapons. While Obama hailed the agreement as a step towards a more hopeful world, it drew strong criticism from Republican and conservative quarters and from Israeli Prime Minister Benjamin Netanyahu.[400][401][402] In addition, the Republican Party criticized the transfer of $1.7 billion in cash to Iran shortly after the deal was announced. The Obama administration said the payment was made in cash because of the "effectiveness of U.S. and international sanctions."[403] To advance the deal, the Obama administration shielded Hezbollah from the Drug Enforcement Administration's Project Cassandra investigation into drug smuggling and from the Central Intelligence Agency.[404][405]
Also in December 2015, Obama began a $348 billion program to back the biggest U.S. buildup of nuclear arms since Ronald Reagan left the White House.[406]
Obama meets Russian president Vladimir Putin in September 2015
In March 2010, an agreement was reached with the administration of Russian President Dmitry Medvedev to replace the 1991 Strategic Arms Reduction Treaty with a new pact reducing the number of long-range nuclear weapons in the arsenals of both countries by about a third.[407] Obama and Medvedev signed the New START treaty in April 2010, and the U.S. Senate ratified it in December 2010.[408] In December 2011, Obama instructed agencies to consider LGBT rights when issuing financial aid to foreign countries.[409] In August 2013, he criticized Russia's law that discriminates against homosexual people,[410] but he stopped short of advocating a boycott of the upcoming 2014 Winter Olympics in Sochi, Russia.[411]
Obama's family history, upbringing, and Ivy League education differ markedly from those of African-American politicians who rose to prominence in the 1960s through their involvement in the civil rights movement.[414] Expressing puzzlement over questions about whether he is "black enough", Obama told an August 2007 meeting of the National Association of Black Journalists that "we're still locked in this notion that if you appeal to white folks then there must be something wrong."[415] Obama acknowledged his youthful image in an October 2007 campaign speech, remarking: "I wouldn't be here if, time and again, the torch had not been passed to a new generation."[416] Additionally, Obama has frequently been referred to as an exceptional orator.[417] During his pre-inauguration transition period and continuing into his presidency, Obama delivered a series of weekly video addresses on YouTube.[418]
According to the Gallup Organization, Obama began his presidency with a 68 percent approval rating,[419] the fifth-highest for a president following his swearing-in.[420] His ratings remained above the majority level until November 2009,[421] and by August 2010 his approval was in the low 40s,[422] a trend similar to Ronald Reagan's and Bill Clinton's first years in office.[423] Following the death of Osama bin Laden on May 2, 2011, Obama experienced a small poll bounce and steadily maintained 50–53 percent approval for about a month, until his approval numbers dropped back to the low 40s.[424][425][426]
His approval rating fell to 38 percent on several occasions in late 2011[427] before recovering in mid-2012, with polls showing an average approval of 50 percent.[428] After his second inauguration in 2013, Obama's approval ratings remained stable around 52 percent[429] before declining for the rest of the year and eventually bottoming out at 39 percent in December.[424] In polling conducted before the 2014 midterm elections, Obama's approval ratings were at their lowest,[430][431] with his disapproval rating reaching a high of 57 percent.[424][432][433] His approval rating continued to lag throughout most of 2015 but began to reach the high 40s by the end of the year.[424][434] According to Gallup, Obama's approval rating reached 50 percent in March 2016, a level unseen since May 2013.[424][435] In polling conducted January 16–19, 2017, Obama's final approval rating was 59 percent, placing him on par with George H. W. Bush and Dwight D. Eisenhower, whose final Gallup ratings also measured in the high 50s.[436]
Obama has maintained relatively positive public perceptions after his presidency.[437] In Gallup's retrospective approval polls of former presidents, Obama garnered a 63 percent approval rating in 2018 and again in 2023, ranking him the fourth most popular president since World War II.[438][439]
Polls showed strong support for Obama in other countries both before and during his presidency.[440][441][442] In a February 2009 poll conducted in Western Europe and the U.S. by Harris Interactive for France 24 and the International Herald Tribune, Obama was rated as the most respected world leader, as well as the most powerful.[443] In a similar poll conducted by Harris in May 2009, Obama was rated as the most popular world leader, as well as the one figure most people would pin their hopes on for pulling the world out of the economic downturn.[444][445]
On October 9, 2009—only nine months into his first term—the Norwegian Nobel Committee announced that Obama had won the 2009 Nobel Peace Prize "for his extraordinary efforts to strengthen international diplomacy and cooperation between peoples",[446] which drew a mixture of praise and criticism from world leaders and media figures.[447][448][449][450] He became the fourth U.S. president to be awarded the Nobel Peace Prize, and the third to become a Nobel laureate while in office.[451] He himself called it a "call to action" and remarked: "I do not view it as a recognition of my own accomplishments but rather an affirmation of American leadership on behalf of aspirations held by people in all nations".[452]
In 2009, the saying "thanks, Obama" first appeared in the Twitter hashtag "#ThanksObama" and was later used in a demotivational poster. It was subsequently adopted satirically to blame Obama for any socio-economic ills. Obama himself used the phrase in videos in 2015 and 2016. In 2017, Stephen Colbert used the phrase to express gratitude to Obama on his last day in office. In 2022, President Joe Biden's Twitter account posted the phrase.
Obama playing golf with Argentinian president Mauricio Macri, October 2017
Obama's presidency ended on January 20, 2017, upon the inauguration of his successor, Donald Trump.[453][454] The family moved to a rented house in Kalorama, Washington, D.C.[455] On March 2, the John F. Kennedy Presidential Library and Museum awarded Obama the Profile in Courage Award "for his enduring commitment to democratic ideals and elevating the standard of political courage."[456] His first public appearance since leaving office was a seminar at the University of Chicago on April 24, where he appealed for a new generation to participate in politics.[457] On September 7, Obama partnered with former presidents Jimmy Carter, George H. W. Bush, Bill Clinton, and George W. Bush to work with One America Appeal to help the victims of Hurricane Harvey and Hurricane Irma in the Gulf Coast and Texas communities.[458] From October 31 to November 1, Obama hosted the inaugural summit of the Obama Foundation,[459] which he intended to be the central focus of a post-presidency he hoped would be more consequential than his time in office.[460]
Obama was reluctant to make an endorsement in the 2020 Democratic presidential primaries because he wanted to position himself to unify the party, regardless of the nominee.[468] On April 14, 2020, Obama endorsed Biden, the presumptive nominee, for president, stating that he had "all the qualities we need in a president right now."[469][470] In May, Obama criticized President Trump for his handling of the COVID-19 pandemic, calling his response to the crisis "an absolute chaotic disaster" and stating that the consequences of the Trump presidency have been "our worst impulses unleashed, our proud reputation around the world badly diminished, and our democratic institutions threatened like never before."[471] On November 17, Obama's presidential memoir, A Promised Land, was released.[472][473][474]
In February 2021, Obama and musician Bruce Springsteen started a podcast called Renegades: Born in the USA, in which the two discuss "their backgrounds, music and their 'enduring love of America.'"[475][476] Later that year, Regina Hicks signed a deal with Netflix, in a venture with the Obamas' Higher Ground, to develop comedy projects.[477]
Obama with President Joe Biden and Vice President Kamala Harris in the White House, April 5, 2022
On March 4, 2022, Obama won an Audio Publishers Association (APA) Award in the best narration by the author category for the narration of his memoir A Promised Land.[478] On April 5, Obama visited the White House for the first time since leaving office, at an event marking the 12th anniversary of the signing of the Affordable Care Act.[479][480][481] In June, it was announced that the Obamas and their podcast production company, Higher Ground, had signed a multi-year deal with Audible.[482][483] In September, Obama visited the White House to unveil his and Michelle's official White House portraits.[484] Around the same time, he won a Primetime Emmy Award for Outstanding Narrator[485] for his narration of the Netflix documentary series Our Great National Parks.[486]
In 2022, Obama opposed expanding the Supreme Court beyond the present nine Justices.[487]
In March 2023, Obama traveled to Australia as part of a speaking tour of the country. During the trip, he met with Australian prime minister Anthony Albanese and visited Melbourne for the first time.[488] Obama was reportedly paid more than $1 million for two speeches.[489][490]
In October 2023, during the Gaza war, Obama declared that Israel must dismantle Hamas in the wake of the Hamas-led attack on Israel.[491] Weeks later, Obama warned Israel that its actions could "harden Palestinian attitudes for generations" and weaken international support for Israel; any military strategy that ignored the war's human costs "could ultimately backfire."[492]
In July 2024, Obama expressed concerns about Biden's campaign viability after his widely criticized debate performance against former president Trump.[493] On July 21, Biden withdrew his candidacy and endorsed Vice President Harris to run as the Democratic nominee. Obama and his wife Michelle endorsed Harris five days later, and he delivered a speech at the 2024 Democratic National Convention formally backing her.[494] He joined Harris on the campaign trail in October, traveling to various swing states, emphasizing her record as a prosecutor, senator, and vice president, and advocating for increased voter turnout; his criticisms of Donald Trump and the Republican Party were widely reported by various media outlets.[495][496] After Trump was declared the winner of the election on November 6, Obama and Michelle congratulated him and Vice President–elect JD Vance while praising the Harris campaign and calling on liberal voters to continue supporting democracy and human rights.[497]
Obama has been described as one of the most effective campaigners in American history (his 2008 campaign being particularly highlighted) as well as one of the most talented political orators of the 21st century.[498][499][500] Historian Julian Zelizer credits Obama with "a keen sense of how the institutions of government work and the ways that his team could design policy proposals." Zelizer notes that Obama's policy successes included the economic stimulus package, which ended the Great Recession, and the Dodd–Frank financial and consumer protection reforms, as well as the Affordable Care Act. Zelizer also notes that the Democratic Party lost power and numbers of elected officials during Obama's term, saying that the consensus among historians is that Obama "turned out to be a very effective policymaker but not a tremendously successful party builder." Zelizer calls this the "defining paradox of Obama's presidency".[501]
The Brookings Institution noted that with "only one major legislative achievement (Obamacare)—and a fragile one at that—the legacy of Obama's presidency mainly rests on its tremendous symbolic importance and the fate of a patchwork of executive actions."[502] David W. Wise noted that Obama fell short "in areas many Progressives hold dear", including the continuation of drone strikes, not going after big banks during the Great Recession, and failing to strengthen his coalition before pushing for Obamacare. Wise called Obama's legacy that of "a disappointingly conventional president".[503]
Obama's most significant accomplishment is generally considered to be the Affordable Care Act (ACA), provisions of which went into effect from 2010 to 2020. Many attempts by Senate Republicans to repeal the ACA, including a "skinny repeal", have thus far failed.[504] However, in 2017, the penalty for violating the individual mandate was repealed effective 2019.[505] Together with the Health Care and Education Reconciliation Act amendment, it represents the U.S. healthcare system's most significant regulatory overhaul and expansion of coverage since the passage of Medicare and Medicaid in 1965.[506][507][508][509]
Many commentators credit Obama with averting a threatened depression and pulling the economy back from the Great Recession.[504] According to the U.S. Bureau of Labor Statistics, the Obama administration created 11.3 million jobs from the month after his first inauguration to the end of his second term.[510] In 2010, Obama signed into effect the Dodd–Frank Wall Street Reform and Consumer Protection Act. Passed as a response to the 2008 financial crisis, it brought the most significant changes to financial regulation in the United States since the regulatory reform that followed the Great Depression under Democratic President Franklin D. Roosevelt.[511]
In 2009, Obama signed into law the National Defense Authorization Act for Fiscal Year 2010, which contained the Matthew Shepard and James Byrd Jr. Hate Crimes Prevention Act, the first addition to federal hate crime law in the United States since Democratic President Bill Clinton signed the Church Arson Prevention Act of 1996. The act expanded existing federal hate crime laws to make it a federal crime to assault people based on sexual orientation, gender identity, or disability.[512]
As president, Obama advanced LGBT rights.[513] In 2010, he signed the Don't Ask, Don't Tell Repeal Act, which brought an end to "don't ask, don't tell" policy in the U.S. armed forces that banned open service from LGBT people; the law went into effect the following year.[514] In 2016, his administration brought an end to the ban on transgender people serving openly in the U.S. armed forces.[515][238] A Gallup poll, taken in the final days of Obama's term, showed that 68 percent of Americans believed the U.S. had made progress on LGBT rights during Obama's eight years in office.[516]
Obama substantially escalated the use of drone strikes against suspected militants and terrorists associated with al-Qaeda and the Taliban.[517] In 2016, the last year of his presidency, the U.S. dropped 26,171 bombs on seven different countries.[518][519] Obama left about 8,400 U.S. troops in Afghanistan, 5,262 in Iraq, 503 in Syria, 133 in Pakistan, 106 in Somalia, seven in Yemen, and two in Libya at the end of his presidency.[520]
According to Pew Research Center and United States Bureau of Justice Statistics, from December 31, 2009, to December 31, 2015, inmates sentenced in U.S. federal custody declined by five percent. This is the largest decline in sentenced inmates in U.S. federal custody since Democratic president Jimmy Carter. By contrast, the federal prison population increased significantly under presidents Ronald Reagan, George H. W. Bush, Bill Clinton, and George W. Bush.[521]
Human Rights Watch (HRW) called Obama's human rights record "mixed", adding that "he has often treated human rights as a secondary interest—nice to support when the cost was not too high, but nothing like a top priority he championed."[221]
Obama left office in January 2017 with a 60 percent approval rating.[522][523] In a Brookings Institution survey, he gained 10 spots over the 2015 edition of the same survey, which had ranked him the 18th-greatest American president.[524] In Gallup's 2018 job approval poll covering the past 10 U.S. presidents, he received a 63 percent approval rating.[438]
The Barack Obama Presidential Center is Obama's planned presidential library. It will be hosted by the University of Chicago and located in Jackson Park on the South Side of Chicago.[525]
——————— (1990). "Tort Law. Prenatal Injuries. Supreme Court of Illinois Refuses to Recognize Cause of Action Brought by Fetus Against Its Mother for Unintentional Infliction of Prenatal Injuries. Stallman v. Youngquist, 125 Ill. 2d 267, 531 N.E.2d 355 (1988)". Harvard Law Review. 103 (3): 823–828. doi:10.2307/1341352. JSTOR 1341352. Uncredited case comment.[527]
Barreto, Amílcar Antonio; O'Bryant, Richard L. (November 12, 2013). "Introduction". American Identity in the Age of Obama. Taylor & Francis. pp. 18–19. ISBN 978-1-317-93715-9. Retrieved May 8, 2017.
Merida, Kevin (December 14, 2007). "The ghost of a father". The Washington Post. p. A12. Archived from the original on August 29, 2008. Retrieved June 25, 2008.
Reyes, B.J. (February 8, 2007). "Punahou left lasting impression on Obama". Honolulu Star-Bulletin. Archived from the original on March 28, 2019. Retrieved February 10, 2007. As a teenager, Obama went to parties and sometimes sought out gatherings on military bases or at the University of Hawaii that were attended mostly by blacks.
For analysis of the political impact of the quote and Obama's more recent admission that he smoked marijuana as a teenager ("When I was a kid, I inhaled"), see:
Possley, Maurice (March 30, 2007). "Activism blossomed in college". Chicago Tribune. p. 20. Archived from the original on October 9, 2010. Retrieved May 12, 2010.
Secter, Bob; McCormick, John (March 30, 2007). "Portrait of a pragmatist". Chicago Tribune. p. 1. Archived from the original on December 14, 2009. Retrieved May 18, 2012.
^ abcMatchan, Linda (February 15, 1990). "A Law Review breakthrough". The Boston Globe. p. 29. Archived from the original on January 22, 2009. Retrieved June 15, 2008.
Corr, John (February 27, 1990). "From mean streets to hallowed halls"(paid archive). The Philadelphia Inquirer. p. C01. Archived from the original on August 28, 2019. Retrieved June 6, 2008.
^Obama, Barack (August–September 1988). "Why organize? Problems and promise in the inner city". Illinois Issues. Vol. 14, no. 8–9. pp. 40–42. ISSN0738-9663. reprinted in: Knoepfle, Peg, ed. (1990). After Alinsky: community organizing in Illinois. Springfield, IL: Sangamon State University. pp. 35–40. ISBN978-0-9620873-3-2. He has also been a consultant and instructor for the Gamaliel Foundation, an organizing institute working throughout the Midwest.
^Obama, Auma (2012). And then life happens: a memoir. New York: St. Martin's Press. pp. 189–208, 212–216. ISBN978-1-250-01005-6.
^Joey Del Ponte; Somerville Scout Staff. "Something in the Water". Somerville Scout. No. January/February 2014. p. 26. Archived from the original on January 1, 2020. Retrieved January 1, 2020. Barack Obama lived in the big, ivy-covered brick building at 365 Broadway... From 1988 to 1991, the future president resided in a basement apartment while attending Harvard Law School.
^ abLevenson, Michael; Saltzman, Jonathan (January 28, 2007). "At Harvard Law, a unifying voice". Boston Globe. p. 1A. Archived from the original on August 3, 2016. Retrieved June 15, 2008.
Mundy, Liza (August 12, 2007). "A series of fortunate events". The Washington Post. p. W10. Archived from the original on August 14, 2007. Retrieved June 15, 2008.
Starr, Alexandra (September 21, 2008). "Case study". The New York Times Magazine. p. 76. Retrieved January 30, 2010.
Hundley, Tom (March 22, 2009). "Ivory tower of power". Chicago Tribune Magazine. p. 6. Archived from the original on April 13, 2010. Retrieved January 30, 2010.
Reynolds, Gretchen (January 1993). "Vote of confidence". Chicago Magazine. Vol. 42, no. 1. pp. 53–54. ISSN 0362-4595. Archived from the original on May 14, 2008. Retrieved June 6, 2008.
Anderson, Veronica (October 3, 1993). "40 under Forty: Barack Obama, Director, Illinois Project Vote". Crain's Chicago Business. Vol. 16, no. 39. p. 43. ISSN 0149-6956.
Branigin, William (January 30, 2009). "Steelers Win Obama's Approval". The Washington Post. Archived from the original on August 5, 2017. Retrieved August 21, 2017. But other than the Bears, the Steelers are probably the team that's closest to my heart.
Kantor, Jodi (June 1, 2007). "One Place Where Obama Goes Elbow to Elbow". The New York Times. Archived from the original on April 1, 2009. Retrieved April 28, 2008. See also: "The Love of the Game" (video). Real Sports with Bryant Gumbel. HBO. April 15, 2008. Archived from the original on October 16, 2011. Retrieved October 12, 2011.
Stolberg, Sheryl Gay; Kirkpatrick, David D.; Shane, Scott (January 22, 2009). "On First Day, Obama Quickly Sets a New Tone". The New York Times. p. 1. Archived from the original on January 23, 2009. Retrieved September 7, 2012.
"American President: Barack Obama". Miller Center of Public Affairs, University of Virginia. 2009. Archived from the original on January 23, 2009. Retrieved January 23, 2009. Religion: Christian
"The Truth about Barack's Faith" (PDF). Obama for America. Archived from the original on January 5, 2011. Retrieved July 1, 2012.
Miller, Lisa (July 18, 2008). "Finding his faith". Newsweek. Archived from the original on February 6, 2010. Retrieved February 4, 2010. He is now a Christian, having been baptized in the early 1990s at Trinity United Church of Christ in Chicago.
Sullivan, Amy (June 29, 2009). "The Obamas find a church home—away from home". Time. Archived from the original on April 4, 2010. Retrieved February 5, 2010. instead of joining a congregation in Washington, D.C., he will follow in George W. Bush's footsteps and make his primary place of worship Evergreen Chapel, the nondenominational church at Camp David.
Kornblut, Anne E. (February 4, 2010). "Obama's spirituality is largely private, but it's influential, advisers say". The Washington Post. p. A6. Retrieved February 5, 2010. Obama prays privately... And when he takes his family to Camp David on the weekends, a Navy chaplain ministers to them, with the daughters attending a form of Sunday school there.
Obama 2006a, pp. 202–208. Portions excerpted in: Obama, Barack (October 16, 2006). "My Spiritual Journey". Time. Archived from the original on April 30, 2008. Retrieved April 28, 2008.
Pulliam, Sarah; Olsen, Ted (January 23, 2008). "Q&A: Barack Obama". Christianity Today. Archived from the original on April 28, 2019. Retrieved January 4, 2013.
Garrett, Major; Obama, Barack (March 14, 2008). "Obama talks to Major Garrett on 'Hannity & Colmes'". RealClearPolitics. Retrieved November 10, 2012. Major Garrett, Fox News correspondent: So the first question, how long have you been a member in good standing of that church? Sen. Barack Obama (D-IL), presidential candidate: You know, I've been a member since 1991 or '92. And—but I have known Trinity even before then when I was a community organizer on the South Side, helping steel workers find jobs... Garrett: As a member in good standing, were you a regular attendee of Sunday services? Obama: You know, I won't say that I was a perfect attendee. I was regular in spurts, because there was times when, for example, our child had just been born, our first child. And so we didn't go as regularly then.
"Obama strongly denounces former pastor". NBC News. Associated Press. April 29, 2008. Retrieved November 10, 2012. I have been a member of Trinity United Church of Christ since 1992, and have known Reverend Wright for 20 years. The person I saw yesterday was not the person [whom] I met 20 years ago.
Miller, Lisa (July 11, 2008). "Finding his faith". Newsweek. Archived from the original on July 20, 2013. Retrieved November 10, 2012. He is now a Christian, having been baptized in the early 1990s at Trinity United Church of Christ in Chicago.
Remnick 2010, p. 177: "In late October 1987, his third year as an organizer, Obama went with Kellman to a conference on the black church and social justice at the Harvard Divinity School."
Maraniss 2012, p. 557: "It would take time for Obama to join and become fully engaged in Wright's church, a place where he would be baptized and married; that would not happen until later, during his second time around in Chicago, but the process started then, in October 1987... Jerry Kellman: 'He wasn't a member of the church during those first three years, but he was drawn to Jeremiah.'"
"Document". Chicago Tribune. June 27, 1993. p. 9 (Business). Archived from the original on December 4, 2013. Retrieved June 15, 2008.(subscription required)
"Business appointments". Chicago-Sun-Times. July 5, 1993. p. 40. Retrieved June 15, 2008.(subscription required)
Ripley, Amanda (November 3, 2004). "Obama's ascent". Time. Archived from the original on August 11, 2010. Retrieved February 13, 2010.
"About us". Miner, Barnhill & Galland—Chicago, Illinois. 2008. Archived from the original on July 20, 2008. Retrieved June 15, 2008.
Reardon, Patrick T. (June 25, 2008). "Obama's Chicago". Chicago Tribune. p. 1 (Tempo). Retrieved February 13, 2010.
Wolffe, Richard; Briscoe, Daren (July 16, 2007). "Across the Divide". Newsweek. Archived from the original on April 18, 2008. Retrieved April 20, 2008.
Strausberg, Chinta (September 26, 2002). "Opposition to war mounts". Chicago Defender. p. 1. Archived from the original (paid archive) on May 11, 2011. Retrieved February 3, 2008.
Strausberg, Chinta (October 3, 2002). "War with Iraq undermines U.N". Chicago Defender. p. 1. Archived from the original on October 14, 2009. Retrieved October 28, 2008. Photo caption: Left Photo: Sen. Barack Obama along with Rev. Jesse Jackson spoke to nearly 3,000 anti-war protestors (below) during a rally at Federal Plaza Wednesday.
Bryant, Greg; Vaughn, Jane B. (October 3, 2002). "300 attend rally against Iraq war". Daily Herald. p. 8. Retrieved October 28, 2008. (subscription required)
McCormick, John (October 3, 2007). "Obama marks '02 war speech; Contender highlights his early opposition in an effort to distinguish him from his rivals". Chicago Tribune. p. 7. Archived from the original on December 18, 2008. Retrieved October 28, 2008. The top strategist for Sen. Barack Obama has just 14 seconds of video of what is one of the most pivotal moments of the presidential candidate's political career. The video, obtained from a Chicago TV station, is of Obama's 2002 speech in opposition to the impending Iraq invasion. (subscription required)
Chase, John; Mendell, David (November 3, 2004). "Obama scores a record landslide" (PDF). Chicago Tribune. p. 1. Archived from the original (PDF) on May 13, 2011. Retrieved April 3, 2009.
Pearson, Rick; Long, Ray (February 10, 2007). "Obama: I'm running for president". Chicago Tribune. Archived from the original on August 13, 2007. Retrieved September 20, 2008.
Falcone, Michael (December 21, 2007). "Obama's 'One Thing'". The New York Times. Archived from the original on July 16, 2022. Retrieved April 14, 2008.
Baker, Peter; Rutenberg, Jim (June 8, 2008). "The Long Road to a Clinton Exit". The New York Times. Archived from the original on December 9, 2008. Retrieved November 29, 2008.
Stein, Rob; Shear, Michael (January 24, 2009). "Funding restored to groups that perform abortions, other care". The Washington Post. p. A3. Archived from the original on November 11, 2012. Retrieved September 21, 2012. Lifting the Mexico City Policy would not permit U.S. tax dollars to be used for abortions, but it would allow funding to resume to groups that provide other services, including counseling about abortions.
Block, Robert; Matthews, Mark K. (January 27, 2010). "White House won't fund NASA moon program". Los Angeles Times. Archived from the original on October 26, 2019. Retrieved January 30, 2011. President Obama's budget proposal includes no money for the Ares I and Ares V rocket or Constellation program. Instead, NASA would be asked to monitor climate change and develop a new rocket
Dyson, Michael Eric (2016). The Black Presidency: Barack Obama and the Politics of Race in America. Houghton Mifflin Harcourt. p. 275. ISBN 978-0-544-38766-9.
Mian, Atif R.; Sufi, Amir (September 1, 2010). "The Effects of Fiscal Stimulus: Evidence from the 2009 'Cash for Clunkers' Program". The Quarterly Journal of Economics. 127 (3): 1107–1142. doi:10.2139/ssrn.1670759. S2CID 219352572. SSRN 1670759.
Theodossiou, Eleni; Hipple, Steven F. (2011). "Unemployment Remains High in 2010" (PDF). Monthly Labor Review. 134 (3): 3–22. Archived from the original (PDF) on May 8, 2011. Retrieved April 7, 2011.
Herszenhorn, David M.; Calmes, Jackie (December 7, 2009). "Abortion Was at Heart of Wrangling". The New York Times. Archived from the original on March 31, 2011. Retrieved December 6, 2009.
Baker, Peter; Cooper, Helene; Mazzetti, Mark (May 2, 2011). "Bin Laden Is Dead, Obama Says". The New York Times. Archived from the original on May 5, 2011. Retrieved May 3, 2011.
Holmes, Stephanie (November 30, 2008). "Obama: Oratory and originality". The Age. Melbourne. Archived from the original on December 18, 2008. Retrieved December 11, 2008.
Zlomislic, Diana (December 11, 2008). "New emotion dubbed 'elevation'". Toronto Star. Archived from the original on December 12, 2008. Retrieved December 11, 2008.
Zelizer, Julian E. (2018). "Policy Revolution without a Political Transformation". In Zelizer, Julian (ed.). The Presidency of Barack Obama: a First Historical Assessment. Princeton University Press. pp. 1–10. ISBN 978-0-691-16028-3.
Eibner, Christine; Nowak, Sarah (2018). The Effect of Eliminating the Individual Mandate Penalty and the Role of Behavioral Factors. Commonwealth Fund (Report). doi:10.26099/SWQZ-5G92.
Koltun, Dave (2005). "The 2004 Illinois Senate Race: Obama Wins Open Seat and Becomes National Political 'Star'". In Ahuja, Sunil; Dewhirst, Robert (eds.). The Road to Congress 2004. Hauppauge, New York: Nova Science Publishers. ISBN 978-1-59454-360-9.
Lizza, Ryan (September 2007). "Above the Fray". GQ. Archived from the original on May 14, 2011. Retrieved October 27, 2010.