
challenging to detect SNMDs because mental status cannot be observed directly from online social activity logs. Our approach to SNMD detection is novel in that it does not rely on users self-revealing those mental factors via questionnaires. Instead, we propose a machine learning framework, Social Network Mental Disorder Detection (SNMDD), built on a new SNMD-based Tensor Model (STM). We hope to assist mental healthcare professionals in alleviating the symptoms of users with Social Network Addictions (SNAs) at an early stage by means of behavioral therapy. To that end, we also propose a novel framework, called the Newsfeed Substituting and Supporting System (N3S), for newsfeed filtering and dissemination in support of SNA interventions. We first propose the Additive Degree Model (ADM) to measure the addictiveness of newsfeeds for different users. We then formulate a new optimization problem that maximizes the efficacy of behavioral therapy without sacrificing user preferences, and we design a randomized algorithm with a theoretical bound for it. The results of large-scale user studies and experiments demonstrate the efficacy of both SNMDD and N3S.
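This summary does not give the ADM's exact formulation; the sketch below is a minimal illustration, assuming addictiveness is an additive sum of per-feature contributions weighted per user, with a simple greedy filter standing in for the randomized algorithm. The feature names, weights, and keep_ratio parameter are all hypothetical.

```python
# Minimal sketch of an additive addictiveness score in the spirit of ADM.
# The actual ADM formulation is not given in this summary; the features,
# weights, and greedy filtering rule below are illustrative assumptions.

def adm_score(newsfeed, user_weights):
    """Additive addictiveness: sum of per-feature contributions."""
    return sum(user_weights.get(f, 0.0) * v for f, v in newsfeed.items())

def filter_newsfeeds(feeds, user_weights, keep_ratio=0.7):
    """Greedy stand-in for the randomized algorithm: keep only the
    least-addictive items, so user preferences are not sacrificed
    entirely while reducing exposure to addictive content."""
    ranked = sorted(feeds, key=lambda f: adm_score(f, user_weights))
    return ranked[: max(1, int(keep_ratio * len(ranked)))]

if __name__ == "__main__":
    weights = {"games": 0.8, "social": 0.5, "news": 0.1}  # hypothetical
    feeds = [{"games": 1.0}, {"news": 1.0}, {"social": 0.6, "news": 0.4}]
    print(filter_newsfeeds(feeds, weights, keep_ratio=0.7))
```

Keeping the least-addictive items is a crude proxy for the trade-off the actual optimization balances with a proven bound: reducing exposure for therapeutic benefit versus preserving content the user wants.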
III. Efficient Data Management for Large-scale Computer Systems with Write-constrained Memory and Storage Devices

In recent years, digital data volumes have been growing rapidly due to emerging big data applications.
Current memory and storage technologies cannot balance cost and performance well enough to meet these storage requirements. Novel memory technologies have therefore been developed for large-scale data applications. We are
considering a novel computer architecture that applies:
(1) non-volatile random access memory (NVRAM) as
the main memory for reducing power consumption of
the system; and (2) shingled magnetic recording (SMR)
drives as storage to enhance capacity. However, both of these devices are subject to write constraints that can significantly degrade performance and endurance. Therefore, to efficiently manage massive
data over such architectures, we are focusing on index
management and have proposed methodologies for
mitigating the overheads of two different levels of the
memory hierarchy. First, to accelerate the index search
operation, we have proposed an NVRAM-friendly sorting
algorithm called B*-sort. B*-sort not only improves sorting performance on NVRAM but also maximizes device lifetime during sorting. To mitigate the overhead of managing indexes on SMR drives, we have also developed a sequential-write-constrained B+tree index scheme, namely SW-B+tree. Both systems have been evaluated through a series of experiments, and the results demonstrate the effectiveness of the designs.
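B*-sort itself is not described in this summary; as a minimal illustration of the write-constraint concern it addresses, the sketch below counts element writes during a selection sort, which performs O(n) element writes, whereas typical comparison sorts move elements O(n log n) times. Since NVRAM cells endure only a limited number of writes, a write-frugal algorithm extends device lifetime. This is an illustration of the design constraint, not the published B*-sort.

```python
# Write-frugal sorting illustration (not the published B*-sort).
# On NVRAM, reads are cheap but each write wears the cell, so an
# NVRAM-friendly sort should minimize element writes, not just
# comparisons. Selection sort performs at most n swaps, i.e. O(n)
# element writes, regardless of input order.

def selection_sort_counting_writes(a):
    writes = 0
    for i in range(len(a)):
        m = min(range(i, len(a)), key=a.__getitem__)  # reads only
        if m != i:
            a[i], a[m] = a[m], a[i]   # one swap = two element writes
            writes += 2
    return writes

if __name__ == "__main__":
    data = [5, 2, 9, 1, 7, 3]
    w = selection_sort_counting_writes(data)
    print(data, "element writes:", w)  # far fewer data moves than a
                                       # typical quicksort or mergesort
```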

IV. Composite Neural Network: Theory and Its Application to PM2.5 Prediction

This work investigates the framework and performance issues of composite neural networks, which are composed of a collection of pre-trained and non-instantiated neural network models connected as a rooted directed acyclic graph to solve complex problems. A pre-trained neural network model is generally well trained and targeted at approximating a specific function. Despite a general belief that a composite neural network should perform better than any single component, its overall performance characteristics have not been clear. In this work, we construct a framework for composite networks and show that, with a high probability bound, a composite network performs better than any of its pre-trained components. In addition, adding an extra pre-trained component to a composite network typically does not degrade overall performance. To validate the theory, we explore a complex application: PM2.5 prediction. In our empirical evaluations, the composite neural network models support the proposed theory and outperform other machine learning models, demonstrating the advantages of the proposed framework.
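The DAG construction and probability bounds of the framework are not reproduced in this summary; the following is a minimal two-level sketch, assuming each pre-trained component maps the same input features to a scalar PM2.5 estimate and only the root combiner is trained. Component architectures, sizes, and the hidden width are illustrative.

```python
# Minimal two-level composite network sketch (PyTorch): frozen
# pre-trained components feeding one trainable root combiner.
# The actual framework composes components as a rooted DAG; this
# sketch shows only the simplest such composition.
import torch
import torch.nn as nn

class CompositeNet(nn.Module):
    def __init__(self, components, hidden=16):
        super().__init__()
        self.components = nn.ModuleList(components)
        for comp in self.components:          # pre-trained parts stay fixed
            for p in comp.parameters():
                p.requires_grad_(False)
        self.root = nn.Sequential(            # non-instantiated root,
            nn.Linear(len(components), hidden),  # trained from scratch
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        # Each component emits one scalar estimate; the root combines them.
        outs = torch.cat([comp(x) for comp in self.components], dim=-1)
        return self.root(outs)

if __name__ == "__main__":
    # Stand-ins for pre-trained components (e.g., per-source predictors).
    comps = [nn.Sequential(nn.Linear(8, 1)) for _ in range(3)]
    model = CompositeNet(comps)
    x = torch.randn(4, 8)                     # 4 samples, 8 features
    print(model(x).shape)                     # torch.Size([4, 1])
```

Freezing the components while training only the root mirrors the setting analyzed in the work: the composite can do no worse than its best frozen component if the root learns at least to pass that component's output through.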