SIFT and Wake Forest School of Medicine propose Characterizing Human Activities for Cancer Health Awareness (CHA-CHA) to recognize activities specifically selected by clinical oncologists, using Artificial Intelligence approaches that translate the patient's video data into symbolic information that humans can interpret. This enables physicians to receive reports on which activities are recognized, why they are recognized, and how the patient performed them. Crucially, this is possible regardless of the physical distance between the physician and their cancer patient. CHA-CHA provides: accurate and interpretable recognition of patient activity, behavior, and performance; automatic segmentation of activities in video; an iterative development process between SIFT and Wake Forest that ensures activities and performance parameters are directly relevant to the cancer health domain; inherent security through encrypted communication and protection of patient information; and experimental validation of accuracy against a gold standard. We use state-of-the-art AI that reasons over symbolic information and, unlike black-box algorithms, enables the physician to provide insights directly to refine recognition. CHA-CHA allows patients to live their lives at home, unburdened, while physicians gain access to the activity and performance records key to monitoring their health state between visits.