# The Sleep Heart Health Study (SHHS)

The Sleep Heart Health Study (SHHS) is a comprehensive landmark study of sleep and its impacts on health outcomes.

For an EEG signal $x(t)$, the discrete Fourier transform (DFT) is $X(f) = \sum_t x(t)\,e^{-2\pi i f t}$, where $i = \sqrt{-1}$ is the imaginary unit. If $[f_1, f_2]$ denotes a range of frequencies, then the power of the signal in that range is defined as $P_{[f_1, f_2]} = \sum_{f_1 \le f \le f_2} |X(f)|^2$. We consider four standard frequency bands: (1) $\delta$ [0.8–4.0 Hz]; (2) $\theta$ [4.1–8.0 Hz]; (3) $\alpha$ [8.1–13.0 Hz]; (4) $\beta$ [13.1–20.0 Hz]. These bands are standard representations of low- through high-frequency activity. To make power comparable across subjects, we normalized it; for example, the normalized $\delta$-power is

$$\mathrm{NP}_\delta = \frac{P_\delta}{P_\delta + P_\theta + P_\alpha + P_\beta}.$$

Normalized power is thought to be less dependent on the amplitude of the signal, which can be influenced by drift that may occur over the course of the full night, particularly with unattended monitoring in the home setting. Because of the nonstationary nature of the EEG signal, the DFT and normalization were applied in adjacent 30-second intervals, resulting in the temporal representation $\mathrm{NP}_\delta(t)$, where $t$ indicates the midpoint of the corresponding 30-second interval. To better understand the data structure, Figure 1 displays the fraction of $\delta$-power for three subjects at two visits in the SHHS. The dots represent the pairs $\{t, \mathrm{NP}_\delta(t)\}$.

Let $Y_{ij}(t)$, $t \in [0, 1]$, be a square-integrable random function, where $t$ is time from sleep onset, $i = 1, 2, \ldots$ indexes subjects, and $j = 1, 2, \ldots$ indexes visits. Without loss of generality, we restrict attention to the case when each subject is measured at every value of $t$. We model

$$Y_{ij}(t) = \mu(t) + \eta_j(t) + Z_i(t) + W_{ij}(t), \qquad (2.1)$$

where $Z_i(t)$ is the subject-specific deviation from the visit-specific mean and $W_{ij}(t)$ is the residual subject- and visit-specific deviation from the subject-specific mean. The Karhunen–Loève (KL) expansions of these processes are

$$Z_i(t) = \sum_k \xi_{ik}\,\phi_k^{(1)}(t), \qquad W_{ij}(t) = \sum_l \zeta_{ijl}\,\phi_l^{(2)}(t), \qquad (2.2)$$

where $\xi_{ik}$ and $\zeta_{ijl}$ are level 1 and level 2 principal component scores, respectively, and $\phi_k^{(1)}$ and $\phi_l^{(2)}$ are level 1 and level 2 eigenfunctions, respectively. Model (2.1) with the KL expansions (2.2) becomes

$$Y_{ij}(t) = \mu(t) + \eta_j(t) + \sum_k \xi_{ik}\,\phi_k^{(1)}(t) + \sum_l \zeta_{ijl}\,\phi_l^{(2)}(t), \qquad (2.3)$$

where $\xi_{ik}$ and $\zeta_{ijl}$ are zero-mean random variables. At first glance, model (2.3) may appear too complex to be implemented for studies with large sample sizes, such as the SHHS. However, we show that inference from this model can be done using a short sequence of simple steps.
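The per-band normalized power in adjacent 30-second windows can be sketched as follows. This is a minimal illustration, not the SHHS processing pipeline: the sampling rate `FS` and the function name are assumptions for illustration, and NumPy's FFT stands in for the DFT described above.

```python
import numpy as np

FS = 125  # samples per second (assumed; not specified in the text)
BANDS = {"delta": (0.8, 4.0), "theta": (4.1, 8.0),
         "alpha": (8.1, 13.0), "beta": (13.1, 20.0)}

def normalized_band_power(signal, fs=FS, window_sec=30):
    """Normalized band power in adjacent 30-second windows.

    Returns a list of (midpoint_seconds, {band: normalized power}),
    one entry per window, mirroring the temporal representation
    NP_band(t) with t at the window midpoint.
    """
    n = int(fs * window_sec)
    results = []
    for start in range(0, len(signal) - n + 1, n):
        seg = signal[start:start + n]
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        power = np.abs(np.fft.rfft(seg)) ** 2   # |X(f)|^2
        band_power = {name: power[(freqs >= lo) & (freqs <= hi)].sum()
                      for name, (lo, hi) in BANDS.items()}
        total = sum(band_power.values())        # P_delta + P_theta + P_alpha + P_beta
        midpoint = (start + n / 2) / fs
        results.append((midpoint, {k: v / total for k, v in band_power.items()}))
    return results
```

A pure 2 Hz tone, for instance, should place essentially all of its normalized power in the $\delta$ band.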
We summarize the core assumptions as follows: (A.1) the level 1 scores $\{\xi_{ik} : k = 1, 2, \ldots\}$ are uncorrelated with the level 2 scores $\{\zeta_{ijl} : l = 1, 2, \ldots\}$. Assumptions (A.1)–(A.4) are standard for functional principal component analysis, and (A.5) corresponds to the assumption stated previously. We write $K_T$, $K_B$, and $K_W$ for the total, between-, and within-subject covariances, respectively. These, of course, are not the same quantities as in mixed ANOVA models, but our notation builds upon the intuitive variance decomposition of those simpler models. The within and between decomposition of variability based on the KL expansion leads to the following convenient algorithm:

1. estimate the mean and covariance functions;
2. use eigenanalysis on $K_B$ to obtain the level 1 eigenvalues and eigenfunctions;
3. use eigenanalysis on $K_W = K_T - K_B$ to obtain the level 2 eigenvalues and eigenfunctions;
4. estimate the principal component scores (technical details in Section 3).

In practice, each step of the algorithm is easy to implement using method of moments estimators. Such estimators can be constructed in a variety of situations. If the sampling points are reasonably dense for each subject/visit, then the data can be smoothed first and the mean predicted on an equally spaced grid of points. A different case occurs when the data for each subject/visit are sparse, but the collection of sampling points over visits and subjects is dense. A reasonable approach in this case would be to consider the histogram of all sampling points. Because $K_W$ is estimated as a difference, it might not be positive definite. This problem can be solved by trimming eigenvalue–eigenvector pairs where the eigenvalue is negative [Hall, Müller and Yao (2008); Müller (2005); Yao, Müller and Wang (2005)]. As shown in Hall, Müller and Yao (2008), this method is more accurate than the method of moments. We used a similar method for choosing the number of components at level 2. These choices were slightly conservative, but worked well in our application and simulations.
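The covariance and eigenanalysis steps of the algorithm above can be sketched with simple moment estimators on a balanced design. This is a minimal sketch under assumed complete, equally spaced sampling with two or more visits per subject; the function name and array layout are hypothetical, visit-specific means are omitted for simplicity, and score estimation (step 4) is not shown.

```python
import numpy as np

def mfpca_covariances(Y):
    """Two-level covariance estimates and their eigenanalyses.

    Y has shape (I, J, T): I subjects, J visits, T grid points.
    Returns ((level-1 eigenvalues, eigenvectors),
             (level-2 eigenvalues, eigenvectors)),
    with negative eigenvalue-eigenvector pairs trimmed.
    """
    I, J, T = Y.shape
    mu = Y.mean(axis=(0, 1))          # step 1: overall mean (visit effects omitted)
    R = Y - mu                         # residual curves
    # Total covariance K_T: average of R_ij(s) R_ij(t).
    KT = np.einsum('ijs,ijt->st', R, R) / (I * J)
    # Between-subject covariance K_B: cross-products of *different*
    # visits of the same subject, which share only the subject effect.
    KB = np.zeros((T, T))
    pairs = 0
    for j in range(J):
        for k in range(J):
            if j != k:
                KB += np.einsum('is,it->st', R[:, j, :], R[:, k, :])
                pairs += 1
    KB /= I * pairs
    KW = KT - KB                       # within-subject covariance, a difference
    def trimmed_eig(K):
        # Symmetrize, then drop negative eigenvalue-eigenvector pairs.
        vals, vecs = np.linalg.eigh((K + K.T) / 2)
        keep = vals > 0
        order = np.argsort(vals[keep])[::-1]
        return vals[keep][order], vecs[:, keep][:, order]
    return trimmed_eig(KB), trimmed_eig(KW)
```

Cross-visit products isolate the between-subject covariance because, under the model, the visit-specific deviations of different visits are uncorrelated.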
However, in any other application the two thresholds should be carefully tuned using simulations. An important tuning parameter is the proportion of variability explained.
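As an illustration of thresholding on the proportion of variability explained, a minimal helper is sketched below; the function name and the 0.90 default are hypothetical, and as noted above the threshold should be tuned per application.

```python
import numpy as np

def n_components(eigenvalues, threshold=0.90):
    """Smallest K such that the leading K eigenvalues explain at
    least `threshold` of the total variability at that level."""
    vals = np.asarray(eigenvalues, dtype=float)
    frac = np.cumsum(vals) / vals.sum()        # cumulative proportion
    return int(np.searchsorted(frac, threshold) + 1)
```

For eigenvalues (4, 3, 2, 1), a 0.5 threshold keeps two components, since 4/10 < 0.5 but 7/10 >= 0.5.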