Do Burns get darker as they heal?

0 votes
asked Jul 4, 2019 in Body/Skin by Djames55 (340 points)
Do Burns get darker as they heal?

1 Answer

0 votes
answered Jul 4, 2019 by Minty (132,850 points)
Yes, it's natural and normal for a burn to become darker in color as it heals. The darker skin color can also be permanent, which is normal as well.

The burned area might instead become lighter in color, which is also normal.

Small, less severe burns will usually heal without any medical or surgical treatment. However, deeper and thicker burns always require medical and sometimes surgical treatment to fully heal.

Severe burns can even lead to death if they are not treated professionally and surgery is not performed in time, and even with treatment death can sometimes still occur.

If you think your burn is severe, it's a good idea to seek medical treatment by going to an urgent care center or emergency room to have it treated and looked after.
