
Scars are a sign of an external wound that has healed. The wound itself can be caused by an accident, surgery, disease, acne, or scratches from sharp objects. To prevent scars and reduce their appearance, it is recommended to care for the wound properly until it has fully healed.
Naturally, the...