Permutation Entropy
Measures the temporal complexity of a time series by analyzing the relative frequencies of its ordinal (rank-order) patterns.
PE = -∑ p(πᵢ) × log(p(πᵢ))
def permutation_entropy(signal, m=5, tau=2):
    # ordinal_patterns / pattern_probabilities are helpers defined elsewhere:
    # they extract rank patterns of order m (delay tau) and their frequencies.
    patterns = ordinal_patterns(signal, m, tau)
    probs = pattern_probabilities(patterns)
    return -sum(p * np.log(p) for p in probs if p > 0)
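Since the helper functions are not shown in the excerpt, here is a minimal self-contained sketch of the same computation, with the ordinal-pattern extraction implemented via NumPy's `argsort` (an assumed implementation, not necessarily the authors'):

```python
import numpy as np
from collections import Counter

def permutation_entropy(signal, m=5, tau=2):
    """Shannon entropy of the ordinal-pattern distribution (order m, delay tau)."""
    n = len(signal) - (m - 1) * tau
    # Reduce each window of m delayed samples to its rank (ordinal) pattern.
    patterns = [tuple(map(int, np.argsort(signal[i:i + (m - 1) * tau + 1:tau])))
                for i in range(n)]
    counts = Counter(patterns)
    probs = np.array(list(counts.values()), dtype=float) / n
    return -np.sum(probs * np.log(probs))

# White noise approaches the maximum log(m!); a monotone ramp gives 0.
rng = np.random.default_rng(0)
print(permutation_entropy(rng.standard_normal(10_000), m=3, tau=1))
print(permutation_entropy(np.arange(100), m=3, tau=1))
```

A fully ordered signal concentrates all probability on one pattern (entropy 0), while white noise spreads it uniformly over the m! patterns.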
Jensen-Shannon Complexity
A normalized statistical-complexity measure that weights the permutation entropy by the Jensen-Shannon divergence between the ordinal-pattern distribution P and the uniform equilibrium distribution Pₑ.
C = H[P] × JS[P,Pₑ] / H_max
import math

def js_complexity(signal, m=5, tau=2):
    pe = permutation_entropy(signal, m, tau)
    js_div = jensen_shannon_divergence(signal, m, tau)
    # np.math was removed in NumPy 2.0; use the stdlib math module instead.
    return pe * js_div / math.log(math.factorial(m))
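The `jensen_shannon_divergence` helper is not shown in the excerpt; a self-contained sketch under that assumption, using SciPy's `jensenshannon` (which returns the square root of the divergence, hence the squaring):

```python
import math
import numpy as np
from collections import Counter
from itertools import permutations
from scipy.spatial.distance import jensenshannon

def pattern_distribution(signal, m=5, tau=2):
    """Relative frequency of each of the m! ordinal patterns (zeros included)."""
    n = len(signal) - (m - 1) * tau
    counts = Counter(tuple(map(int, np.argsort(signal[i:i + (m - 1) * tau + 1:tau])))
                     for i in range(n))
    return np.array([counts.get(p, 0) for p in permutations(range(m))],
                    dtype=float) / n

def js_complexity(signal, m=5, tau=2):
    P = pattern_distribution(signal, m, tau)
    Pe = np.full(len(P), 1.0 / len(P))         # uniform equilibrium distribution
    pe = -np.sum(P[P > 0] * np.log(P[P > 0]))  # permutation entropy H[P]
    js_div = jensenshannon(P, Pe) ** 2         # squared distance -> divergence
    return pe * js_div / math.log(math.factorial(m))

# A monotone ramp is fully ordered: entropy 0, hence complexity 0.
print(js_complexity(np.arange(300.0), m=3, tau=1))
```

Both extremes score low: a ramp because H[P] = 0, white noise because P ≈ Pₑ makes the divergence vanish; intermediate structure yields higher C.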
Statistical Validation
One-way ANOVA F-test with effect-size (η²) analysis for testing feature significance.
F = MS_between / MS_within
η² = SS_between / SS_total
from scipy.stats import f_oneway

# One F-test per feature, comparing its values across activity classes
f_stat, p_value = f_oneway(*activity_groups)
# Effect size: sums of squares computed from the same groups
eta_squared = ss_between / ss_total
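`activity_groups`, `ss_between`, and `ss_total` are not defined in the excerpt; a self-contained sketch on synthetic groups (the group data below is illustrative only) that also shows how the sums of squares can be computed:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(42)
# Three synthetic "activity" groups with shifted means for one feature
activity_groups = [rng.normal(loc=mu, scale=1.0, size=50)
                   for mu in (0.0, 0.5, 1.5)]

f_stat, p_value = f_oneway(*activity_groups)

# Effect size: eta^2 = SS_between / SS_total
all_values = np.concatenate(activity_groups)
grand_mean = all_values.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in activity_groups)
ss_total = np.sum((all_values - grand_mean) ** 2)
eta_squared = ss_between / ss_total

print(f"F = {f_stat:.2f}, p = {p_value:.3g}, eta^2 = {eta_squared:.3f}")
```

As a consistency check, F = [SS_between/(k-1)] / [(SS_total - SS_between)/(N-k)] reproduces the `f_oneway` statistic exactly.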
Vertical Axis Discovery
Novel finding: 27.8% of the discriminative information comes from vertical-axis (z) processing.
Dominance_ratio = ∑F_vertical / ∑F_total
# Share of total discriminative power carried by the vertical-axis features
# (f_stats and total_importance come from the ANOVA step above)
vertical_features = ['PE_z', 'Complexity_z', 'Vertical_Dominance']
vertical_importance = sum(f_stats[f] for f in vertical_features)
dominance_ratio = vertical_importance / total_importance
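End to end, with placeholder F-statistics (the values below are hypothetical and do not reproduce the 27.8% figure; real values come from the ANOVA step):

```python
# Hypothetical per-feature F-statistics, for illustration only
f_stats = {
    'PE_x': 12.0, 'PE_y': 9.0, 'PE_z': 15.0,
    'Complexity_x': 7.0, 'Complexity_y': 6.0, 'Complexity_z': 11.0,
    'Vertical_Dominance': 14.0,
}

vertical_features = ['PE_z', 'Complexity_z', 'Vertical_Dominance']
vertical_importance = sum(f_stats[f] for f in vertical_features)
total_importance = sum(f_stats.values())
dominance_ratio = vertical_importance / total_importance
print(f"Vertical dominance ratio: {dominance_ratio:.1%}")
```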