Adding some clarity to the vitamin D debate: all things in moderation

The debate over the benefits of sun exposure for vitamin D production versus the risk of skin cancer has been prolonged and confusing. The list of virtues a healthy vitamin D level offers our bodies keeps growing. Against this, the production of vitamin D relies on exposing our skin to ultraviolet (UV) light, a well-known human carcinogen. Mixed messages are sent, leaving the public unsure of and distrusting the advice they receive. One minute we are told to cover up. The next report tells us there is a vitamin D deficiency epidemic and we must all get more sun exposure to help our bones, support our immune system, and prevent cancer. While media reports favour dramatic, sweeping statements, it is difficult to give general advice when each person's genetic and environmental make-up is so varied. Dark-skinned people living in areas of low UV exposure are far more at risk of vitamin D deficiency than of skin cancer. The opposite is true for fair-skinned people living in a high-UV environment.

A 2014 study of fair-skinned Danish people showed that exposure to high UV levels while on holiday in the Canary Islands not only increased vitamin D levels, but also significantly increased known biological markers of DNA damage in skin cells. So what do we do? We need sun exposure to make vitamin D, but as soon as we try to get it, do we induce skin cancer?

As with most debates, a degree of moderation often proves the best conclusion. With this in mind, a recent study by Felton et al. was reported in the British Journal of Dermatology. Natural skin pigmentation is graded by the Fitzpatrick Skin Type Scale: types I through VI run from extremely fair skin that always burns and never tans (pale peach complexion, blond or red hair, blue eyes, freckles) to very dark skin that never burns and always tans.

In the latest study, Felton et al. compared the responses of people with Type V (dark brown) skin and people with Type II (fair) skin to simulated UK-latitude sun exposure. The UV dose was equivalent to 13-17 minutes in the UK June midday sun, six times per week for six weeks. The results confirmed what we already know: following sun exposure, Type II skin showed more DNA damage, and Type V skin produced less serum vitamin D.

Interestingly, however, they also found that our skin possesses mechanisms that respond to gentle sun exposure by repairing sun-damaged DNA, thereby reducing the risk of UVB-induced skin cancer. Twenty-four hours after the six-week exposure period, the urine levels of CPD (cyclobutane pyrimidine dimers, a surrogate marker of skin DNA damage) and of another DNA-damage marker (8-oxo-dG) were undetectable in both groups. This strongly suggests that fair skin can adapt to gentle sun exposure, protecting itself against damage while still allowing the production of vitamin D.

Moderation

The important difference between the Danish study and Felton's study is the level of UV exposure. The Danish study involved high-intensity UV; Felton's study involved longer, lower-level exposures. Clearly, summertime UV levels in New Zealand and Australia are intense, and 13-17 minutes in the midday sun there is likely to yield very different, and detrimental, results. However, gentle low-level exposure (e.g. early mornings, late evenings, outside of summer) may give us the best of both worlds. This also sits nicely with the advice to use sunscreen during times of high exposure. Sunscreen filters out a proportion of UV radiation rather than completely blocking it, reducing the dose reaching your skin, and should therefore help achieve the effect seen in Felton's study rather than that seen in the Danish study. A number of studies have shown that the use of sunscreen does not reduce vitamin D levels in subjects exposed to the sun. Again, generalised advice is difficult. We need to take into account our skin type, our individual risk factors for skin cancer (past exposure, family history, and so on), and our geographic location to strike a balance that works for our own situation.
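To put the "filter, not block" point in numbers: SPF is defined as the ratio of the UV dose needed to redden sunscreen-protected skin to the dose needed to redden unprotected skin, so the fraction of erythemal UV transmitted is roughly the reciprocal of the SPF. A minimal sketch of the arithmetic, assuming ideal, even application (real-world application is usually thinner, letting more UV through):

% Fraction of erythemal UV transmitted through sunscreen,
% assuming ideal application at the tested thickness
\[
T \approx \frac{1}{\mathrm{SPF}}, \qquad
T_{\mathrm{SPF\,30}} \approx \frac{1}{30} \approx 3.3\%
\]
% Even a high-SPF product transmits a small but non-zero UV dose,
% which helps explain why sunscreen users can still make vitamin D.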

We know from previous studies that high-intensity UV exposure is not needed for efficient vitamin D production; in fact, vitamin D begins to degrade at high UV exposure levels. While the relationship between increasing UV exposure and DNA damage is linear, the relationship between UV exposure and vitamin D production is not: maximum vitamin D production occurs at around one third of the UV dose that induces a slight sunburn. Clearly, the advice never to allow yourself to get sunburnt stands strong. The body's homeostatic and protective mechanisms are remarkable, but they can be overwhelmed by extreme exposure, which is common during the Australasian summer. However, we need not avoid all sun exposure and risk vitamin D deficiency either. The benefits of sunlight independent of vitamin D, as well as of regular outdoor physical activity, are well known. This study helps us understand that, with moderation, the clever machine that is our body may well be able to manage the balance between an optimal vitamin D level and skin cancer risk.
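To see why moderation wins on the numbers, consider a toy model (illustrative only; the functional form for vitamin D production is an assumption for the sketch, not taken from either study): DNA damage rises in direct proportion to the UV dose, while vitamin D production rises, peaks, and then falls as photodegradation sets in.

% Toy dose-response model (illustrative; not fitted to study data).
% u  : UV dose, in units of the minimal erythemal dose (MED)
% k  : proportionality constant for DNA damage
% u* : dose of peak vitamin D production, ~1/3 MED per the text above
\[
\mathrm{Damage}(u) = k\,u,
\qquad
\mathrm{VitD}(u) \propto \frac{u}{u^{*}}\,e^{\,1 - u/u^{*}},
\qquad
u^{*} \approx \tfrac{1}{3}\,\mathrm{MED}
\]
% At u = u*, the vitamin D term reaches its maximum, while DNA damage
% is only one third of its level at a full sunburn dose (u = 1 MED).

In this sketch, tripling the dose from the vitamin D optimum up to a full sunburn dose triples the DNA damage while actually reducing vitamin D output, which is the quantitative core of the moderation argument.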

A word of caution, however: Felton's study was performed on subjects aged 23-59 years. Whether the same results would be found in older subjects is unknown. Prior studies demonstrate that older skin lacks the same ability to repair DNA damage, owing to reduced IGF-1 production by the skin's fibroblasts. Interestingly, anti-ageing resurfacing procedures such as micro-needling, dermabrasion, and lasers appear not only to improve the skin's appearance, but also to rejuvenate the dermal fibroblasts' protective IGF-1 production. Studies are ongoing to determine whether skin resurfacing procedures do indeed produce a lasting preventative effect against skin cancer, rejuvenating not only the skin's appearance but its function as well.