Your understanding isn't correct, at least for nursing.
Nursing was a predominantly male profession until the 1800s. https://en.wikipedia.org/wiki/Men_in_nursing#History gives examples of predominantly male nurses and caregivers in other cultures before that time. http://minoritynurse.com/rethinking-gender-stereotypes-in-nu... says "Before modern day nursing, men were nurses, not women. The earliest recorded nursing school was established in India around 250 B.C. It was exclusively for men; women were not allowed to attend because it was believed that women were not as pure as men."
What happened in the 1800s? Quoting from https://files.eric.ed.gov/fulltext/EJ1081399.pdf:
> Through the efforts of Florence Nightingale in the mid-nineteenth century, nursing was established as a women's profession (Hus, Chen & Lou, 2010). Nightingale's image of the nurse as subordinate, nurturing, domestic, humble, and self-sacrificing, as well as not too educated, became prevalent in society. The American Nursing Association ostracized men from nursing until 1930, when as a "result of a bylaw amendment, provision was made for male nurses to become members of the American Nurses' Association" (In Review - American Nurses' Association, p. 6). Looking back in nursing history, Florence Nightingale, and the American Nursing Association ostracized men from the nursing profession.
See also http://onlinelibrary.wiley.com/doi/10.1046/j.1365-2648.1997...., which notes: "History appears to indicate that men have had a place in nursing for as long as records are available, but their contribution has been perceived as negligible, largely because of the dominant influence that the 19th century female nursing movement has had on the occupation's historical ideology."