Note
This page was generated from a Jupyter Notebook. Download the notebook (.ipynb)
gwexpy TimeSeries New Features Tutorial
This notebook introduces the new methods added to the TimeSeries class in gwexpy, an extension of gwpy, and explains how to use them.
gwexpy maintains high compatibility with GWpy while significantly enhancing signal processing, statistical analysis, and interoperability with other libraries.
Table of Contents
Environment Setup
Signal Processing and Demodulation (Hilbert, Phase, Demodulation)
Spectral Analysis and Correlation (FFT, Transfer Function, xcorr)
Hilbert-Huang Transform (HHT)
Statistics and Preprocessing (Impute, Standardize, ARIMA, Hurst, Rolling)
Resampling and Reindexing (asfreq, resample)
Function Fitting (fit)
Interoperability (Pandas, Xarray, Torch, and more)
1. Environment Setup
First, we import the necessary libraries and generate sample data for demonstration.
[1]:
import matplotlib.pyplot as plt
import numpy as np
from astropy import units as u
from gwexpy.noise.wave import chirp, exponential, gaussian, sine
from gwexpy.plot import Plot
from gwexpy.timeseries import TimeSeries
# Generate sample data (5 seconds of data sampled at 100Hz)
fs = 100
duration = 5.0
# Time vector (used for reference)
t = np.arange(0, duration, 1 / fs)
# Sensor 1: 10Hz sine wave + noise
s1 = sine(duration=duration, sample_rate=fs, frequency=10, amplitude=1.0)
n1 = gaussian(duration=duration, sample_rate=fs, std=0.2)
ts1 = s1 + n1
ts1.name = "Sensor 1"
ts1.override_unit("V")
# Sensor 2: Chirp signal (frequency changes over time: 5Hz -> 25Hz) + exponential amplification
s2 = chirp(duration=duration, sample_rate=fs, f0=5, f1=25, t1=duration)
env = exponential(
duration=duration, sample_rate=fs, tau=2.0, decay=False, amplitude=0.2
)
ts2 = s2 * env
ts2.name = "Chirp Signal"
ts2.override_unit("V")
2. Signal Processing and Demodulation
gwexpy integrates Hilbert transforms, envelope calculation, instantaneous-frequency computation, and lock-in-amplifier-style demodulation.
Hilbert Transform and Envelope
Use hilbert and envelope.
[2]:
# Calculate the analytic signal
ts_analytic = ts2.hilbert()
# Calculate the envelope
ts_env = ts2.envelope()
plot = Plot(ts2, ts_env, figsize=(10, 4))
ax = plot.gca()
ax.get_lines()[0].set_label("Original (Chirp)")
ax.get_lines()[0].set_alpha(0.5)
ax.get_lines()[1].set_label("Envelope")
ax.get_lines()[1].set_color("red")
ax.get_lines()[1].set_linewidth(2)
ax.legend()
ax.set_title("Hilbert Transform and Envelope")
plt.show()
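Under the hood, the envelope is the magnitude of the analytic signal. As a sanity check independent of gwexpy, the same result can be sketched with SciPy on plain NumPy arrays (the test signal here is chosen so that both the carrier and the modulation are periodic over the record, making the FFT-based Hilbert transform essentially exact):

```python
import numpy as np
from scipy.signal import hilbert

fs = 100
t = np.arange(0, 5.0, 1 / fs)
# AM signal: 10 Hz carrier, 0.4 Hz modulation (both periodic over the 5 s span)
modulation = 1.0 + 0.5 * np.sin(2 * np.pi * 0.4 * t)
x = modulation * np.sin(2 * np.pi * 10 * t)

analytic = hilbert(x)        # analytic signal: x + i * H[x]
envelope = np.abs(analytic)  # instantaneous amplitude

err = np.max(np.abs(envelope - modulation))
print(f"max envelope error: {err:.2e}")
```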
Instantaneous Phase and Instantaneous Frequency
Use instantaneous_phase and instantaneous_frequency. You can specify phase unwrapping (unwrap) and degree units (deg).
[3]:
# Instantaneous phase (unwrap=True to remove phase jumps)
phase_rad = ts2.instantaneous_phase(unwrap=True)
phase_deg = ts2.instantaneous_phase(deg=True, unwrap=True)
# Instantaneous frequency
freq = ts2.instantaneous_frequency()
plot = Plot(phase_rad, freq, separate=True, sharex=True, figsize=(10, 6))
ax = plot.axes
ax[0].set_ylabel("Phase [rad]")
ax[0].set_title("Instantaneous Phase (Unwrapped)")
ax[1].set_ylabel("Frequency [Hz]")
ax[1].set_title("Instantaneous Frequency")
plt.show()
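Conceptually, instantaneous frequency is the time derivative of the unwrapped analytic phase divided by 2π. A minimal SciPy/NumPy sketch of that definition (not the gwexpy implementation):

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000
t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * 50 * t)  # pure 50 Hz tone, periodic over the record

phase = np.unwrap(np.angle(hilbert(x)))          # instantaneous phase [rad]
inst_freq = np.gradient(phase, t) / (2 * np.pi)  # instantaneous frequency [Hz]

print(f"mean instantaneous frequency: {inst_freq[10:-10].mean():.2f} Hz")  # -> 50.00 Hz
```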
Mixing and Demodulation (Mix-down, Baseband, Lock-in)
Functions for extracting specific frequency components or bringing them down to baseband.
[4]:
# 1. mix_down: Demodulate at a specific frequency f0 (becomes a complex signal)
ts_mixed = ts1.mix_down(f0=10)
# 2. baseband: Execute demodulation + lowpass + resampling in one step
ts_base = ts1.baseband(f0=10, lowpass=5, output_rate=20)
# 3. lock_in: Lock-in detection (extract amplitude and phase)
amp, ph = ts1.lock_in(f0=10, stride=0.1) # Output average every 0.1 seconds
res_complex = ts1.lock_in(
f0=10, stride=0.1, output="complex"
) # Output as complex numbers
print(amp)
plot = Plot(amp, ph, separate=True, sharex=True, figsize=(10, 6))
ax = plot.axes
ax[0].get_lines()[0].set_label("Amplitude")
ax[0].axhline(1.0, color="gray", linestyle="--", label="Theoretical (Sine amp=1)")
ax[0].set_ylabel("Amplitude")
ax[0].legend()
ax[1].get_lines()[0].set_color("orange")
ax[1].get_lines()[0].set_label("Phase [deg]")
ax[1].set_ylabel("Phase [deg]")
plot.figure.suptitle("Lock-in Amplification Result at 10Hz")
plt.show()
TimeSeries([1.02741597, 1.16552899, 0.93765881, 0.87731236,
1.02777865, 0.91539637, 1.01821019, 0.90240575,
1.0111066 , 0.9914134 , 1.00146936, 1.12517907,
0.91267559, 0.95556561, 1.04049211, 1.00646786,
1.14199336, 1.04945363, 1.03907644, 0.96127339,
0.92151929, 1.07825152, 0.95047776, 0.97586436,
1.0151117 , 1.06991705, 1.02617825, 1.01241226,
1.01888422, 1.11207074, 1.07279902, 1.00769159,
1.08451442, 0.89048937, 1.16849788, 0.9963111 ,
0.9457141 , 0.81001496, 0.94112349, 1.11907093,
1.0364996 , 0.9199482 , 1.22395555, 1.1179697 ,
0.95298715, 1.12631136, 0.95882988, 0.85918777,
1.03224085, 1.17684784],
unit: V,
t0: 0.0 s,
dt: 0.1 s,
name: Sensor 1,
channel: None)
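The lock-in result above can be understood as mixing the signal with a complex reference at f0 and then averaging (the lowpass step). A self-contained NumPy sketch of the principle, not the gwexpy implementation (the factor of 2 recovers the real amplitude; an integer number of cycles makes the 2·f0 mixing term average to exactly zero):

```python
import numpy as np

fs, f0 = 1000, 37.0
t = np.arange(0, 1.0, 1 / fs)
x = 0.8 * np.cos(2 * np.pi * f0 * t + 0.3)  # amplitude 0.8, phase 0.3 rad

# Mix down with a complex reference at f0, then average (the lowpass step)
z = 2 * (x * np.exp(-2j * np.pi * f0 * t)).mean()

print(f"amplitude = {abs(z):.3f}, phase = {np.angle(z):.3f} rad")  # -> 0.800, 0.300
```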
3. Spectral Analysis and Correlation
While inheriting GWpy’s functionality, gwexpy adds a mode="transient" FFT for transient signals and transfer-function calculation via a direct FFT ratio.
Extended FFT
Using mode="transient" enables zero-padding, adjustment to fast FFT lengths, and individual left/right padding specifications.
[5]:
# Standard FFT (GWpy compatible)
fs_gwpy = ts1.fft()
# Transient mode:
# Specify padding amount with pad_left/right. Use nfft_mode="next_fast_len" to adjust to computationally efficient length.
fs_trans = ts1.fft(
mode="transient", pad_left=0.5, pad_right=0.5, nfft_mode="next_fast_len"
)
plot = Plot(fs_gwpy.abs(), fs_trans.abs(), yscale="log", xlim=(0, 50), figsize=(10, 4))
ax = plot.gca()
ax.get_lines()[0].set_label("Standard FFT")
ax.get_lines()[1].set_label("Transient FFT (Padded)")
ax.get_lines()[1].set_alpha(0.7)
ax.legend()
ax.set_title("Comparison of FFT modes")
plt.show()
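The "fast length" adjustment corresponds to picking a nearby FFT size with small prime factors, as scipy.fft.next_fast_len does. A plain SciPy sketch of the idea (this assumes only standard SciPy, not gwexpy internals):

```python
import numpy as np
from scipy.fft import next_fast_len, rfft, rfftfreq

fs = 100
x = np.random.default_rng(0).normal(size=487)  # 487 is prime: a slow FFT length

n = next_fast_len(len(x))      # smallest "fast" FFT size >= 487
spec = rfft(x, n=n)            # rfft zero-pads to n internally
freqs = rfftfreq(n, d=1 / fs)  # frequency grid for the padded length

print(f"length {len(x)} -> fast length {n}, {spec.size} frequency bins")
```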
Transfer Function and Cross-Correlation (xcorr)
With transfer_function, you can choose not only the Welch method (mode="steady") but also a direct FFT ratio (mode="transient").
[6]:
# Calculate transfer function (transfer function of Chirp Signal with respect to Sensor 1)
tf_welch = ts2.transfer_function(ts1, fftlength=1)
tf_fft = ts2.transfer_function(
ts1, mode="transient"
) # Full-span FFT ratio (useful for transient response analysis)
# Cross-correlation (xcorr)
corr = ts1.xcorr(ts2, maxlag=0.5, normalize="coeff")
plot = Plot(figsize=(10, 6))
ax1 = plot.add_subplot(2, 1, 1)
ax1.semilogy(tf_welch.frequencies, np.abs(tf_welch), label="Welch method")
ax1.semilogy(tf_fft.frequencies, np.abs(tf_fft), label="Direct FFT ratio", alpha=0.5)
ax1.set_title("Transfer Function Magnitude")
ax1.legend()
ax2 = plot.add_subplot(2, 1, 2)
lag = corr.times.value - corr.t0.value
ax2.plot(lag, corr)
ax2.set_title("Cross-Correlation (normalized)")
ax2.set_xlabel("Lag [s]")
plt.show()
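The transient-mode transfer function is conceptually just the ratio of the two spectra, H(f) = Y(f)/X(f). A NumPy sketch with a known system (gain 0.5 and a 3-sample circular delay, for which the FFT ratio is exact):

```python
import numpy as np
from numpy.fft import rfft, rfftfreq

fs = 100
rng = np.random.default_rng(1)
x = rng.normal(size=1000)
y = 0.5 * np.roll(x, 3)  # known system: gain 0.5, 3-sample circular delay

H = rfft(y) / rfft(x)    # direct FFT ratio (transient-style estimate)
freqs = rfftfreq(len(x), d=1 / fs)

# For a circular delay the ratio is exact: |H(f)| = 0.5 at every bin,
# and the phase slope encodes the 3-sample delay
print(f"|H| min/max: {np.abs(H).min():.3f} / {np.abs(H).max():.3f}")
```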
STLT (Short-Time Laplace Transform)
The short-time Laplace transform (STLT) extracts local, time-varying structure in a signal. In gwexpy, the STLT result is a 3D transform with axes (time × sigma × frequency), represented by TimePlaneTransform / LaplaceGram. Below, we compute the STLT with the stlt method and extract a slice (Plane2D) at a specific time. The example also shows how to pass explicit frequencies (in Hz) to evaluate the STLT at arbitrary frequency points.
[7]:
# Data preparation (for demonstration)
import numpy as np
from gwexpy.plot import Plot
t = np.linspace(0, 10, 1000)
data = TimeSeries(np.sin(2 * np.pi * 1 * t), times=t * u.s, unit="V", name="Demo Data")
# STLT
# stride: time step, window: analysis window length
freqs = np.array([0.5, 1.0, 1.5]) # Hz
stlt_result = data.stlt(stride="0.5s", window="2s", frequencies=freqs)
print(f"Kind: {stlt_result.kind}")
print(f"Shape: {stlt_result.shape} (Time x Sigma x Frequency)")
print(f"Time Axis: {len(stlt_result.times)} steps")
print(f"Sigma Axis: {len(stlt_result.axis1.index)} bins")
print(f"Frequency Axis: {len(stlt_result.axis2.index)} bins")
# Extract plane at specific time (t=5.0s)
plane_at_5s = stlt_result.at_time(5.0 * u.s)
print(f"Plane at 5.0s shape: {plane_at_5s.shape}")
# Confirm behavior as a Plane2D
print(f"Axis 1: {plane_at_5s.axis1.name}")
print(f"Axis 2: {plane_at_5s.axis2.name}")
Kind: stlt
Shape: (17, 1, 3) (Time x Sigma x Frequency)
Time Axis: 17 steps
Sigma Axis: 1 bins
Frequency Axis: 3 bins
Plane at 5.0s shape: (1, 3)
Axis 1: sigma
Axis 2: frequency
4. Hilbert-Huang Transform (HHT)
The Hilbert-Huang Transform (HHT) supports nonlinear and non-stationary signal analysis by combining Empirical Mode Decomposition (EMD) with Hilbert spectral analysis.
[8]:
# HHT (Hilbert-Huang Transform)
# Execute Empirical Mode Decomposition (EMD) and extract IMFs (Intrinsic Mode Functions)
# Note: Requires PyEMD (EMD-signal) : `pip install EMD-signal`
try:
import os
import warnings
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"
warnings.filterwarnings("ignore", category=UserWarning, module="google.protobuf")
warnings.filterwarnings("ignore", message="Protobuf gencode version")
# Execute EMD (returns a dictionary of IMFs)
# method="emd" (standard EMD) or "eemd" (Ensemble EMD)
# Here we use ts2 (example with chirp signal)
imfs = ts2.emd(method="emd", max_imf=3)
print(f"Extracted IMFs: {list(imfs.keys())}")
# Plot IMFs
sorted_keys = sorted(
[k for k in imfs.keys() if k.startswith("IMF")], key=lambda x: int(x[3:])
)
if "residual" in imfs:
sorted_keys.append("residual")
# gwexpy.plot.Plot for batch plotting
plot_data = [ts2] + [imfs[k] for k in sorted_keys]
plot = Plot(*plot_data, separate=True, sharex=True, figsize=(10, 8))
# Original settings
ax0 = plot.axes[0]
ax0.get_lines()[0].set_label("Original")
ax0.get_lines()[0].set_color("black")
ax0.legend(loc="upper right")
ax0.set_title("Empirical Mode Decomposition")
# IMF settings
for i, key in enumerate(sorted_keys):
ax = plot.axes[i + 1]
ax.get_lines()[0].set_label(key)
ax.legend(loc="upper right")
plt.show()
except ImportError:
print("EMD-signal not installed. Skipping HHT demo.")
except Exception as e:
print(f"HHT Error: {e}")
Extracted IMFs: ['IMF1', 'IMF2', 'IMF3', 'residual']
5. Statistics and Preprocessing
Missing value imputation, standardization, ARIMA models, the Hurst exponent, and rolling statistics are available as TimeSeries methods.
[9]:
# Test data with missing values
ts_nan = ts1.copy()
ts_nan.value[100:150] = np.nan
# 1. impute: Missing value imputation (interpolation, etc.)
ts_imputed = ts_nan.impute(method="interpolate")
# 2. standardize: Standardization (z-score, robust, etc.)
ts_z = ts1.standardize(method="zscore")
ts_robust = ts1.standardize(method="zscore", robust=True) # Use Median/IQR
plot = Plot(ts_nan, ts_imputed, figsize=(10, 4))
ax = plot.gca()
ax.get_lines()[0].set_label("with NaNs")
ax.get_lines()[0].set_color("red")
ax.get_lines()[0].set_alpha(0.3)
ax.get_lines()[1].set_label("Imputed")
ax.get_lines()[1].set_linestyle("--")
ax.legend()
ax.set_title("Missing Value Imputation")
plt.show()
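For reference, z-score standardization subtracts the mean and divides by the standard deviation, while a robust variant uses the median and IQR instead (the exact robust scaling convention used by gwexpy may differ; this is just the common definition):

```python
import numpy as np

rng = np.random.default_rng(42)
x = 3.0 + 2.0 * rng.normal(size=1000)  # mean ~3, std ~2

z = (x - x.mean()) / x.std()           # classic z-score: mean 0, std 1
med = np.median(x)
iqr = np.percentile(x, 75) - np.percentile(x, 25)
z_robust = (x - med) / iqr             # robust variant: insensitive to outliers

print(f"z-score mean: {z.mean():+.1e}, std: {z.std():.3f}")
```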
Peak Detection (Find Peaks)
Wraps scipy.signal.find_peaks to detect peaks in time series data.
[10]:
# Peak Detection (Find Peaks)
# Parameters such as height, threshold, distance, prominence, and width can be specified
# Thresholds can also be given with units
# Find peaks in ts2 (the chirp signal)
peaks, props = ts2.find_peaks(height=0.0, distance=50)
print(f"Found {len(peaks)} peaks")
if len(peaks) > 0:
print("First 5 peaks:", peaks[:5])
# Plot
plot = ts2.plot(figsize=(12, 4))
ax = plot.gca()
ax.scatter(
peaks.times.value,
peaks.value,
marker="x",
color="red",
s=100,
label="Peaks",
zorder=10,
)
ax.legend(loc="upper right")
ax.set_title("Peak Detection Result")
plt.show()
Found 9 peaks
First 5 peaks: TimeSeries([0.25680508, 0.32974425, 0.44255219, 0.58570943,
0.81881855],
unit: V,
t0: 0.5 s,
dt: 0.5 s,
name: Chirp Signal_peaks,
channel: None)
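Since this wraps scipy.signal.find_peaks, the same detection works on plain arrays. A minimal SciPy example (a 3 Hz tone has six maxima in 2 s):

```python
import numpy as np
from scipy.signal import find_peaks

fs = 100
t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * 3 * t)  # 3 Hz tone: six maxima in 2 s

# height: minimum peak value; distance: minimum separation in samples
idx, props = find_peaks(x, height=0.5, distance=10)
print(f"{len(idx)} peaks near t = {np.round(t[idx], 2)} s")  # -> 6 peaks
```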
ARIMA Model and Hurst Exponent
Note: These features require libraries such as statsmodels and hurst.
[11]:
try:
import os
import warnings
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"
warnings.filterwarnings("ignore", category=UserWarning, module="google.protobuf")
warnings.filterwarnings("ignore", message="Protobuf gencode version")
# 3. fit_arima: ARIMA(1,0,0) fitting and forecasting
model = ts1.fit_arima(order=(1, 0, 0))
resid = model.residuals()
forecast, conf = model.forecast(steps=30)
plot = Plot(ts1.tail(100), forecast, figsize=(10, 4))
ax = plot.gca()
ax.get_lines()[0].set_label("Measured")
ax.get_lines()[1].set_label("Forecast")
ax.get_lines()[1].set_color("orange")
# Fill between
ax.fill_between(
conf["lower"].times.value,
conf["lower"].value,
conf["upper"].value,
alpha=0.2,
color="orange",
)
ax.set_title("ARIMA(1,0,0) Fit & Forecast")
ax.legend()
plt.show()
except Exception as e:
print(f"ARIMA skipping: {e}")
try:
import os
import warnings
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"
warnings.filterwarnings("ignore", category=UserWarning, module="google.protobuf")
warnings.filterwarnings("ignore", message="Protobuf gencode version")
# 4. hurst / local_hurst: Hurst exponent (indicator of long-range correlation)
h_val = ts1.hurst()
h_detail = ts1.hurst(return_details=True) # With detailed information
h_local = ts1.local_hurst(window=1.0) # Evolution with 1-second window
plot = Plot(h_local, figsize=(10, 4))
ax = plot.gca()
ax.get_lines()[0].set_label("Local Hurst")
ax.axhline(h_val, color="red", linestyle="--", label=f"Global H={h_val:.2f}")
ax.set_ylim(0, 0.2)
ax.set_title("Hurst Exponent Analysis")
ax.legend()
plt.show()
except Exception as e:
print(f"Hurst skipping: {e}")
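For intuition, the Hurst exponent can be estimated with a minimal rescaled-range (R/S) regression. This sketch is not the gwexpy or hurst-library implementation, and small-sample bias typically pushes the white-noise estimate slightly above the ideal 0.5:

```python
import numpy as np

def rs_hurst(x, window_sizes):
    """Minimal rescaled-range (R/S) estimate of the Hurst exponent."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            chunk = x[start:start + n]
            dev = np.cumsum(chunk - chunk.mean())  # cumulative deviations
            r = dev.max() - dev.min()              # range R
            s = chunk.std()                        # scale S
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    return np.polyfit(log_n, log_rs, 1)[0]         # slope of log(R/S) vs log(n)

rng = np.random.default_rng(7)
h = rs_hurst(rng.normal(size=2000), [16, 32, 64, 128, 256])
print(f"H for white noise: {h:.2f}")  # near 0.5 (biased slightly high)
```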
Rolling Statistics
As in Pandas, rolling_mean, rolling_std, rolling_median, rolling_min, and rolling_max are available.
[12]:
rw = 0.5 * u.s # 0.5 second window
rmean = ts1.rolling_mean(rw)
rstd = ts1.rolling_std(rw)
rmed = ts1.rolling_median(rw)
rmin = ts1.rolling_min(rw)
rmax = ts1.rolling_max(rw)
plot = Plot(ts1, rmean, rmed, figsize=(10, 4))
ax = plot.gca()
ax.get_lines()[0].set_label("Original")
ax.get_lines()[0].set_alpha(0.3)
ax.get_lines()[1].set_label("Rolling Mean")
ax.get_lines()[1].set_color("blue")
ax.get_lines()[2].set_label("Rolling Median")
ax.get_lines()[2].set_color("green")
ax.get_lines()[2].set_linestyle("--")
ax.fill_between(
rmean.times.value,
rmean.value - rstd.value,
rmean.value + rstd.value,
alpha=0.1,
color="blue",
)
ax.legend()
ax.set_title("Rolling Statistics (Window=0.5s)")
plt.show()
6. Resampling and Reindexing
The asfreq method reindexes data onto a fixed time grid, and the resample method enables time-bin aggregation.
[13]:
# 1. asfreq: Reindex with Pandas-like frequency strings ('50ms', etc.)
ts_reindexed = ts1.asfreq("50ms", method="pad")
Plot(ts1, ts_reindexed)
plt.legend(["Original", ".asfreq('50ms', method='pad')"])
[13]:
<matplotlib.legend.Legend at 0x7f888ca06210>
[14]:
# 2. resample:
# A numeric argument (10 -> 10 Hz) performs signal-processing resampling (GWpy standard)
ts_sig = ts1.resample(10)
# A string argument ('200ms') computes statistics per time bin (new feature)
ts_binned = ts1.resample("200ms")  # Default aggregation is the mean
# Draw the binned series as a step plot via ax.step
plot = Plot(figsize=(10, 4))
ax = plot.gca()
ax.plot(ts1, alpha=0.6, label="Original (100Hz)")
ax.step(
ts_binned.times,
ts_binned.value,
where="post",
label="Binned Mean (200ms)",
linewidth=2,
)
ax.legend()
ax.set_title("Resampling: Signal Resampling vs Time Binning")
plt.show()
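This asfreq/resample split mirrors pandas, where asfreq is pure reindexing and resample is bin aggregation. For comparison, the same distinction in pandas itself:

```python
import numpy as np
import pandas as pd

idx = pd.date_range("2024-01-01", periods=10, freq="100ms")
s = pd.Series(np.arange(10.0), index=idx)

a = s.asfreq("200ms")           # pure reindexing: picks the sample on each grid point
r = s.resample("200ms").mean()  # aggregation: mean over each 200 ms bin

print(a.tolist())  # -> [0.0, 2.0, 4.0, 6.0, 8.0]
print(r.tolist())  # -> [0.5, 2.5, 4.5, 6.5, 8.5]
```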
7. Function Fitting
gwexpy provides powerful fitting functionality based on iminuit. To avoid polluting GWpy’s original classes, the .fit() method is opt-in.
[15]:
from gwexpy.fitting.models import damped_oscillation
# Fit with damped oscillation model (pass function directly)
# Initial values: A=0.5, tau=0.5, f=15, phi=0
result = data.fit(damped_oscillation, A=0.5, tau=0.5, f=15, phi=0)
# Display result (iminuit format)
print(result)
# Get best-fit curve
# Note: x_data is the time array corresponding to data points
x_data = data.times.value
best_fit = result.model(x_data)
# Plot
plot = data.plot(label="Noisy Signal")
ax = plot.gca()
ax.plot(data.times, best_fit, label="Best Fit", color="red", linestyle="--")
ax.legend()
ax.set_title("Damped Oscillation Fit")
plt.show()
┌─────────────────────────────────────────────────────────────────────────┐
│ Migrad │
├──────────────────────────────────┬──────────────────────────────────────┤
│ FCN = 0.0005703 (χ²/ndof = 0.0) │ Nfcn = 324 │
│ EDM = 0.000176 (Goal: 0.0002) │ │
├──────────────────────────────────┼──────────────────────────────────────┤
│ Valid Minimum │ Below EDM threshold (goal x 10) │
├──────────────────────────────────┼──────────────────────────────────────┤
│ No parameters at limit │ Below call limit │
├──────────────────────────────────┼──────────────────────────────────────┤
│ Hesse ok │ Covariance accurate │
└──────────────────────────────────┴──────────────────────────────────────┘
┌───┬──────┬───────────┬───────────┬────────────┬────────────┬─────────┬─────────┬───────┐
│ │ Name │ Value │ Hesse Err │ Minos Err- │ Minos Err+ │ Limit- │ Limit+ │ Fixed │
├───┼──────┼───────────┼───────────┼────────────┼────────────┼─────────┼─────────┼───────┤
│ 0 │ A │ 1.00 │ 0.06 │ │ │ │ │ │
│ 1 │ tau │ 0 │ 0.06e6 │ │ │ │ │ │
│ 2 │ f │ 1.0000 │ 0.0025 │ │ │ │ │ │
│ 3 │ phi │ -0.00 │ 0.09 │ │ │ │ │ │
└───┴──────┴───────────┴───────────┴────────────┴────────────┴─────────┴─────────┴───────┘
┌─────┬─────────────────────────────────────────────────────┐
│ │ A tau f phi │
├─────┼─────────────────────────────────────────────────────┤
│ A │ 0.00354 -2.4033831e3 3e-6 -0.0001 │
│ tau │ -2.4033831e3 3.78e+09 -8.714e-3 38.287 │
│ f │ 3e-6 -8.714e-3 6.05e-06 -190e-6 │
│ phi │ -0.0001 38.287 -190e-6 0.00796 │
└─────┴─────────────────────────────────────────────────────┘
8. Interoperability
Conversion to and from major data science and machine learning libraries is seamless.
[16]:
# Pandas & Xarray
try:
import os
import warnings
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"
warnings.filterwarnings("ignore", category=UserWarning, module="google.protobuf")
warnings.filterwarnings("ignore", message="Protobuf gencode version")
df = ts1.to_pandas(index="datetime")
ts_p = TimeSeries.from_pandas(df)
print("Pandas interop OK")
display(df)
except ImportError:
pass
try:
import os
import warnings
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"
warnings.filterwarnings("ignore", category=UserWarning, module="google.protobuf")
warnings.filterwarnings("ignore", message="Protobuf gencode version")
xr = ts1.to_xarray()
ts_xr = TimeSeries.from_xarray(xr)
print("Xarray interop OK")
print(xr)
except ImportError:
pass
Pandas interop OK
time_utc
1980-01-06 00:00:19+00:00 0.046388
1980-01-06 00:00:19.010000+00:00 0.647492
1980-01-06 00:00:19.020000+00:00 1.065642
1980-01-06 00:00:19.030000+00:00 0.965746
1980-01-06 00:00:19.040000+00:00 0.381562
...
1980-01-06 00:00:23.950000+00:00 -0.044864
1980-01-06 00:00:23.960000+00:00 -1.078486
1980-01-06 00:00:23.970000+00:00 -1.070583
1980-01-06 00:00:23.980000+00:00 -1.417911
1980-01-06 00:00:23.990000+00:00 -0.814040
Name: Sensor 1, Length: 500, dtype: float64
Xarray interop OK
<xarray.DataArray 'Sensor 1' (time: 500)> Size: 4kB
array([ 0.04638754, 0.64749198, 1.065642 , ..., -1.07058346,
-1.41791072, -0.81403963], shape=(500,))
Coordinates:
* time (time) datetime64[us] 4kB 1980-01-06T00:00:19 ... 1980-01-06T00:...
Attributes:
unit: V
name: Sensor 1
channel: None
epoch: 0.0
time_coord: datetime
[17]:
# SQLite (serialized storage)
import sqlite3
with sqlite3.connect(":memory:") as conn:
ts1.to_sqlite(conn, series_id="my_sensor")
ts_sql = TimeSeries.from_sqlite(conn, series_id="my_sensor")
print(f"SQLite interop OK: {ts_sql.name}")
display(conn)
SQLite interop OK: my_sensor
<sqlite3.Connection at 0x7f888c429a80>
[18]:
# Deep Learning (Torch)
try:
import os
import warnings
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"
warnings.filterwarnings("ignore", category=UserWarning, module="google.protobuf")
warnings.filterwarnings("ignore", message="Protobuf gencode version")
import torch
_ = torch
t_torch = ts1.to_torch()
ts_f_torch = TimeSeries.from_torch(t_torch, t0=ts1.t0, dt=ts1.dt)
print(f"Torch interop OK (Shape: {t_torch.shape})")
display(t_torch)
except ImportError:
pass
Torch interop OK (Shape: torch.Size([500]))
tensor([ 0.0464, 0.6475, 1.0656, 0.9657, 0.3816, 0.0622, -0.6356, -1.0794,
-0.7308, -0.8552, 0.0488, 0.8327, 1.0911, 1.1001, 0.7533, 0.3071,
-0.6714, -1.0896, -1.1494, -0.4882, 0.0975, 0.5406, 0.7696, 0.9663,
0.4898, -0.0777, -0.6696, -1.1976, -0.7228, -0.2931, -0.0028, 0.2829,
0.9369, 0.6805, 0.4277, 0.0320, -0.4927, -0.7845, -1.0255, -0.6953,
-0.0411, 0.6839, 1.1269, 0.9615, 0.4438, -0.0907, -0.6406, -0.8447,
-1.0047, -0.5893, -0.0036, 0.4071, 0.4382, 0.7529, 0.5119, -0.1262,
-0.7335, -1.1044, -1.0567, -0.7100, -0.1608, 0.5598, 0.6929, 1.2075,
0.3549, 0.2035, -0.3401, -0.7552, -1.3182, -0.8315, 0.0933, 0.6035,
0.5282, 0.7698, 0.8536, -0.4679, -0.6726, -0.7438, -1.0586, -0.5111,
0.1467, 0.3026, 1.1198, 0.9874, 0.7874, 0.0307, -0.9738, -0.9570,
-0.6891, -0.4521, -0.0571, 0.6273, 1.1553, 1.0336, 0.3556, -0.1351,
-0.2875, -0.9143, -1.2683, -0.0608, 0.0245, 0.9128, 0.8705, 0.9398,
0.4296, 0.0929, -0.6372, -1.0289, -0.9222, -0.4074, -0.0777, 0.8234,
1.2523, 1.0361, 0.6642, -0.4549, -0.6332, -0.8189, -1.1706, -0.4758,
0.2472, 0.8005, 0.6308, 1.0934, 0.7929, -0.1512, -0.1754, -0.7980,
-0.9528, -0.3716, -0.0251, 0.8563, 0.9884, 0.8112, 0.6516, -0.2263,
-0.4570, -1.0308, -0.7368, -0.3230, 0.0844, 0.1085, 0.9761, 0.8515,
0.6489, 0.1840, -0.4756, -1.2669, -1.0958, -0.7626, -0.2226, 0.7576,
0.4219, 0.8982, 0.8639, -0.0034, -0.8159, -1.1682, -0.9241, -0.5962,
-0.1423, 0.9597, 1.1992, 0.9319, 0.9600, -0.1788, -0.4176, -0.8910,
-1.0888, -0.7202, -0.0083, 0.7785, 1.0403, 0.9340, 0.5111, -0.1721,
-0.3285, -1.0769, -1.1836, -0.4435, 0.0806, 0.6707, 0.8159, 1.1654,
0.3574, 0.2168, -0.4858, -1.2450, -0.9495, -0.5682, -0.2319, 0.3171,
0.9825, 1.0912, 0.6020, -0.3807, -0.2687, -1.0019, -0.9070, -0.5297,
-0.4362, 0.6717, 0.4850, 0.7797, 0.6272, -0.1340, -0.3801, -0.9708,
-1.1015, -0.6837, 0.3875, 0.4940, 1.4391, 1.0907, 0.3844, 0.0291,
-0.5800, -0.9723, -0.9012, -0.5287, 0.1729, 0.6809, 0.7962, 1.0377,
0.5088, -0.3007, -0.2109, -0.9188, -1.1525, -0.3426, 0.1059, 0.5113,
0.6266, 0.8849, 0.4342, -0.0344, -0.6040, -0.9961, -1.1623, -0.8121,
-0.2505, 0.8045, 1.1004, 0.6716, 0.3631, -0.0280, -0.8513, -0.8406,
-1.0045, -0.7484, 0.0428, 0.6988, 1.2261, 1.0643, 0.6354, 0.0596,
-0.5272, -1.0956, -0.7236, -0.5869, 0.2148, 0.6463, 1.1702, 0.8724,
0.4414, 0.0235, -0.5383, -0.7801, -1.1182, -0.7203, 0.2195, 0.8072,
0.6978, 1.0616, 0.8328, -0.1168, -0.4378, -0.7956, -1.2464, -0.3815,
-0.1001, 0.3811, 0.5560, 0.9901, 0.7981, -0.0188, -0.3356, -1.1124,
-1.2726, -0.6649, -0.1109, 0.7850, 1.1996, 0.8740, 0.9379, -0.1293,
-0.5823, -1.0590, -0.7796, -0.8226, -0.0406, 0.4633, 1.0267, 0.9550,
0.5572, -0.1142, -0.9954, -0.9940, -1.0040, -0.6570, 0.0931, 0.2487,
0.8043, 0.9719, 0.5296, -0.3353, -0.9096, -1.2257, -0.9117, -0.4942,
-0.0938, 0.7561, 1.0658, 1.1588, 0.5632, -0.1268, -0.4705, -0.8734,
-0.8987, -0.9596, -0.1707, 0.3471, 0.8704, 0.7419, 0.6649, 0.3992,
-0.4781, -0.8181, -0.8553, -0.6102, 0.0814, 0.7883, 1.2576, 1.0262,
0.7470, 0.0896, -0.3859, -1.1007, -1.0906, -0.7712, 0.2559, 1.0653,
0.9275, 0.8464, 0.6607, 0.1129, -0.9892, -0.6975, -0.7020, -0.4851,
-0.2541, 0.8015, 0.6543, 1.1418, 0.7448, 0.0133, -0.7011, -0.5372,
-0.8241, -0.6561, 0.1557, 0.4652, 0.5265, 0.9202, 0.5620, -0.2656,
-0.2747, -0.5983, -1.0008, -0.6513, -0.0123, 0.6977, 0.8989, 0.9509,
0.4166, -0.0762, -0.5200, -1.0243, -0.6268, -0.6950, -0.0523, 0.5078,
0.7747, 1.3156, 0.7636, 0.3180, -0.6924, -0.7782, -1.5194, -0.3771,
0.1046, 0.4746, 1.1452, 1.2266, 0.6404, 0.0953, -0.3422, -1.0153,
-0.9000, -0.4180, 0.0028, 0.6386, 1.0614, 0.8799, 0.5019, 0.0206,
-0.2362, -1.0362, -0.8191, -0.3009, -0.0230, 0.5445, 1.1566, 1.3140,
0.8786, 0.2502, -0.4860, -1.1523, -1.1008, -0.7689, 0.2256, 1.1344,
1.0777, 0.7569, 0.3501, 0.1471, -0.6135, -1.0268, -1.3333, -0.5307,
0.0433, 0.6135, 0.8277, 0.9068, 0.1230, 0.1666, -0.7289, -1.0812,
-0.9220, -0.5599, 0.0956, 0.4025, 1.1383, 1.1862, 0.6867, 0.1058,
-0.5614, -1.0379, -1.3037, -0.3751, 0.0925, 0.5991, 1.1272, 0.9814,
0.4477, -0.2624, -0.5834, -0.7528, -0.6734, -0.7800, -0.1239, 0.5857,
0.7691, 0.7799, 0.7076, -0.0904, -0.3474, -0.4404, -1.0781, -0.6404,
-0.0728, 0.3693, 0.9677, 1.2791, 0.4837, -0.0032, -0.6991, -0.9540,
-0.9075, -0.5778, 0.0288, 0.6943, 1.1160, 0.6975, 0.4226, -0.0449,
-1.0785, -1.0706, -1.4179, -0.8140], dtype=torch.float64)
[19]:
# Deep Learning (TensorFlow)
try:
import os
import warnings
os.environ.setdefault("TF_CPP_MIN_LOG_LEVEL", "3")
warnings.filterwarnings(
"ignore", category=UserWarning, module=r"google\.protobuf\..*"
)
warnings.filterwarnings(
"ignore", category=UserWarning, message=r"Protobuf gencode version.*"
)
import tensorflow as tf
_ = tf
t_tf = ts1.to_tensorflow()
ts_f_tf = TimeSeries.from_tensorflow(t_tf, t0=ts1.t0, dt=ts1.dt)
print("TensorFlow interop OK")
display(t_tf)
except ImportError:
pass
[20]:
# ObsPy (seismic waveform and time series analysis)
try:
import os
import warnings
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"
warnings.filterwarnings("ignore", category=UserWarning, module="google.protobuf")
warnings.filterwarnings("ignore", message="Protobuf gencode version")
import obspy
_ = obspy
tr = ts1.to_obspy()
ts_f_obspy = TimeSeries.from_obspy(tr)
print(f"ObsPy interop OK: {tr.id}")
display(tr)
except ImportError:
pass
ObsPy interop OK: .Sensor 1..
.Sensor 1.. | 1980-01-06T00:00:00.000000Z - 1980-01-06T00:00:04.990000Z | 100.0 Hz, 500 samples
Summary
In this tutorial, we have covered the key enhancements in gwexpy.TimeSeries:
Signal Processing: Hilbert transform, demodulation, and extended FFT modes.
Statistics: Cross-correlation, ARIMA modeling, Hurst exponent analysis, and rolling statistics.
Data Cleaning: Missing value imputation and standardization.
Interoperability: Seamless conversion to/from Pandas, Xarray, PyTorch, TensorFlow, and ObsPy.
Next Steps
Multichannel Data: Learn about TimeSeriesMatrix for handling many channels simultaneously.
Spectral Analysis: Explore FrequencySeries.
4D Fields: Check out ScalarField for spacetime analysis.