Repository: meyiao/ImuFusion
Branch: master
Commit: 23554fb4df3a
Files: 10
Total size: 17.9 KB
Directory structure:
gitextract_3433hldn/
├── README.md
├── SixDofAnimation.m
├── compVq.m
├── data/
│ ├── SpiralStairs.mat
│ ├── StairsAndCorridor.mat
│ ├── StraightLine.mat
│ ├── attitude_data.mat
│ └── readme.txt
├── orien.m
└── zupt.m
================================================
FILE CONTENTS
================================================
================================================
FILE: README.md
================================================
# ImuFusion
## EKF IMU Fusion Algorithms
1. orien.m uses an extended Kalman filter to fuse the gyroscope's and accelerometer's readings into the IMU's attitude (quaternion).<br>
2. zupt.m implements the so-called 'zero-velocity update' (ZUPT) algorithm for pedestrian (gait) tracking; it is also an EKF.<br>
* Video: http://v.youku.com/v_show/id_XMTg2NjI4NTI4NA==.html
## Usage
Example data already included.<br>
Simply run orien.m or zupt.m. For zupt.m, set 'CreateVideo' to true if you'd like to save the results as a video.<br>
Note that the datasets and the code for visualizing the results were from:
https://github.com/xioTechnologies/Gait-Tracking-With-x-IMU
## References:
[1] S. Madgwick. An Efficient Orientation Filter for Inertial and Inertial/Magnetic Sensor Arrays.<br>
[2] C. Fischer, et al. Implementing a Pedestrian Tracker Using Inertial Sensors.<br>
[3] I. Skog, et al. Zero-Velocity Detection — An Algorithm Evaluation.<br>
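The core of the orientation filter in orien.m is the first-order quaternion propagation F = I + (dt/2)·Omega(w) followed by renormalization. A minimal Python sketch of just that step (plain lists, scalar-first quaternions; the function names here are illustrative, not part of the repo):

```python
import math

def omega_matrix(w):
    # 4x4 matrix Omega(w) from the quaternion kinematics
    # q_dot = 0.5 * Omega(w) * q, scalar-first convention (as in orien.m)
    wx, wy, wz = w
    return [[0.0, -wx, -wy, -wz],
            [ wx, 0.0,  wz, -wy],
            [ wy, -wz, 0.0,  wx],
            [ wz,  wy, -wx, 0.0]]

def propagate(q, w, dt):
    # One predict step: F = I + (dt/2)*Omega(w), then renormalize
    Om = omega_matrix(w)
    q_new = [q[i] + 0.5 * dt * sum(Om[i][j] * q[j] for j in range(4))
             for i in range(4)]
    n = math.sqrt(sum(c * c for c in q_new))
    return [c / n for c in q_new]

# Integrating a 1 rad/s z-rotation for 1 s should give a half-angle of 0.5 rad
q = [1.0, 0.0, 0.0, 0.0]
for _ in range(1000):
    q = propagate(q, (0.0, 0.0, 1.0), 0.001)
# q ≈ [cos(0.5), 0, 0, sin(0.5)]
```

Renormalizing after each step keeps the first-order transition matrix usable even though it is not exactly orthogonal.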
================================================
FILE: SixDofAnimation.m
================================================
% Note that this animation code was adapted from:
% https://github.com/xioTechnologies/Gait-Tracking-With-x-IMU
function fig = SixDOFanimation(varargin)
%% Create local variables
% Required arguments
p = varargin{1}; % position of body
R = varargin{2}; % rotation matrix of body
[numSamples, ~] = size(p);
% Default values of optional arguments
SamplePlotFreq = 1;
Trail = 'Off';
LimitRatio = 1;
Position = [];
FullScreen = false;
View = [30 20];
AxisLength = 1;
ShowArrowHead = 'on';
Xlabel = 'X';
Ylabel = 'Y';
Zlabel = 'Z';
Title = '6DOF Animation';
ShowLegend = true;
write_video = false;
for i = 3:2:nargin
if strcmp(varargin{i}, 'SamplePlotFreq'), SamplePlotFreq = varargin{i+1};
elseif strcmp(varargin{i}, 'Trail')
Trail = varargin{i+1};
if(~strcmp(Trail, 'Off') && ~strcmp(Trail, 'DotsOnly') && ~strcmp(Trail, 'All'))
error('Invalid argument. Trail must be ''Off'', ''DotsOnly'' or ''All''.');
end
elseif strcmp(varargin{i}, 'LimitRatio'), LimitRatio = varargin{i+1};
elseif strcmp(varargin{i}, 'Position'), Position = varargin{i+1};
elseif strcmp(varargin{i}, 'FullScreen'), FullScreen = varargin{i+1};
elseif strcmp(varargin{i}, 'View'), View = varargin{i+1};
elseif strcmp(varargin{i}, 'AxisLength'), AxisLength = varargin{i+1};
elseif strcmp(varargin{i}, 'ShowArrowHead'), ShowArrowHead = varargin{i+1};
elseif strcmp(varargin{i}, 'Xlabel'), Xlabel = varargin{i+1};
elseif strcmp(varargin{i}, 'Ylabel'), Ylabel = varargin{i+1};
elseif strcmp(varargin{i}, 'Zlabel'), Zlabel = varargin{i+1};
elseif strcmp(varargin{i}, 'Title'), Title = varargin{i+1};
elseif strcmp(varargin{i}, 'ShowLegend'), ShowLegend = varargin{i+1};
elseif strcmp(varargin{i}, 'CreateVideo'), write_video = varargin{i+1};
else error('Invalid argument.');
end
end;
%% Reduce data to samples to plot only
p = p(1:SamplePlotFreq:numSamples, :);
R = R(:, :, 1:SamplePlotFreq:numSamples) * AxisLength;
if(numel(View) > 2)
View = View(1:SamplePlotFreq:numSamples, :);
end
[numPlotSamples, ~] = size(p);
%% Setup AVI file
% % % write video
if write_video
writeObj = VideoWriter('results');
writeObj.FrameRate = 60;
open(writeObj);
end
%% Setup figure and plot
% Create figure
fig = figure();
if(FullScreen)
screenSize = get(0, 'ScreenSize');
set(fig, 'Position', [0 0 screenSize(3) screenSize(4)]);
elseif(~isempty(Position))
set(fig, 'Position', Position);
end
set(gca, 'drawmode', 'fast');
lighting phong;
set(gcf, 'Renderer', 'zbuffer');
hold on;
axis equal;
grid on;
view(View(1, 1), View(1, 2));
title(Title);
xlabel(Xlabel);
ylabel(Ylabel);
zlabel(Zlabel);
% Create plot data arrays
if(strcmp(Trail, 'DotsOnly') || strcmp(Trail, 'All'))
x = zeros(numPlotSamples, 1);
y = zeros(numPlotSamples, 1);
z = zeros(numPlotSamples, 1);
end
if(strcmp(Trail, 'All'))
ox = zeros(numPlotSamples, 1);
oy = zeros(numPlotSamples, 1);
oz = zeros(numPlotSamples, 1);
ux = zeros(numPlotSamples, 1);
vx = zeros(numPlotSamples, 1);
wx = zeros(numPlotSamples, 1);
uy = zeros(numPlotSamples, 1);
vy = zeros(numPlotSamples, 1);
wy = zeros(numPlotSamples, 1);
uz = zeros(numPlotSamples, 1);
vz = zeros(numPlotSamples, 1);
wz = zeros(numPlotSamples, 1);
end
x(1) = p(1,1);
y(1) = p(1,2);
z(1) = p(1,3);
ox(1) = x(1);
oy(1) = y(1);
oz(1) = z(1);
ux(1) = R(1,1,1:1);
vx(1) = R(2,1,1:1);
wx(1) = R(3,1,1:1);
uy(1) = R(1,2,1:1);
vy(1) = R(2,2,1:1);
wy(1) = R(3,2,1:1);
uz(1) = R(1,3,1:1);
vz(1) = R(2,3,1:1);
wz(1) = R(3,3,1:1);
% Create graphics handles
orgHandle = plot3(x, y, z, 'k.');
if(ShowArrowHead)
ShowArrowHeadStr = 'on';
else
ShowArrowHeadStr = 'off';
end
quivXhandle = quiver3(ox, oy, oz, ux, vx, wx, 'r', 'ShowArrowHead', ShowArrowHeadStr, 'MaxHeadSize', 0.999999, 'AutoScale', 'off');
quivYhandle = quiver3(ox, oy, oz, uy, vy, wy, 'g', 'ShowArrowHead', ShowArrowHeadStr, 'MaxHeadSize', 0.999999, 'AutoScale', 'off');
quivZhandle = quiver3(ox, oy, oz, uz, vz, wz, 'b', 'ShowArrowHead', ShowArrowHeadStr, 'MaxHeadSize', 0.999999, 'AutoScale', 'off');
% Create legend
if(ShowLegend)
legend('Origin', 'X', 'Y', 'Z');
end
% Set initial limits
Xlim = [x(1)-AxisLength x(1)+AxisLength] * LimitRatio;
Ylim = [y(1)-AxisLength y(1)+AxisLength] * LimitRatio;
Zlim = [z(1)-AxisLength z(1)+AxisLength] * LimitRatio;
set(gca, 'Xlim', Xlim, 'Ylim', Ylim, 'Zlim', Zlim);
% Set initial view
view(View(1, :));
%% Plot one sample at a time
for i = 1:numPlotSamples
% Update graph title
if(strcmp(Title, ''))
titleText = sprintf('Sample %i of %i', 1+((i-1)*SamplePlotFreq), numSamples);
else
titleText = strcat(Title, ' (', sprintf('Sample %i of %i', 1+((i-1)*SamplePlotFreq), numSamples), ')');
end
title(titleText);
% Plot body x y z axes
if(strcmp(Trail, 'DotsOnly') || strcmp(Trail, 'All'))
x(1:i) = p(1:i,1);
y(1:i) = p(1:i,2);
z(1:i) = p(1:i,3);
else
x = p(i,1);
y = p(i,2);
z = p(i,3);
end
if(strcmp(Trail, 'All'))
ox(1:i) = p(1:i,1);
oy(1:i) = p(1:i,2);
oz(1:i) = p(1:i,3);
ux(1:i) = R(1,1,1:i);
vx(1:i) = R(2,1,1:i);
wx(1:i) = R(3,1,1:i);
uy(1:i) = R(1,2,1:i);
vy(1:i) = R(2,2,1:i);
wy(1:i) = R(3,2,1:i);
uz(1:i) = R(1,3,1:i);
vz(1:i) = R(2,3,1:i);
wz(1:i) = R(3,3,1:i);
else
ox = p(i,1);
oy = p(i,2);
oz = p(i,3);
ux = R(1,1,i);
vx = R(2,1,i);
wx = R(3,1,i);
uy = R(1,2,i);
vy = R(2,2,i);
wy = R(3,2,i);
uz = R(1,3,i);
vz = R(2,3,i);
wz = R(3,3,i);
end
set(orgHandle, 'xdata', x, 'ydata', y, 'zdata', z);
set(quivXhandle, 'xdata', ox, 'ydata', oy, 'zdata', oz,'udata', ux, 'vdata', vx, 'wdata', wx);
set(quivYhandle, 'xdata', ox, 'ydata', oy, 'zdata', oz,'udata', uy, 'vdata', vy, 'wdata', wy);
set(quivZhandle, 'xdata', ox, 'ydata', oy, 'zdata', oz,'udata', uz, 'vdata', vz, 'wdata', wz);
% Adjust axes for snug fit and draw
axisLimChanged = false;
if((p(i,1) - AxisLength) < Xlim(1)), Xlim(1) = p(i,1) - LimitRatio*AxisLength; axisLimChanged = true; end
if((p(i,2) - AxisLength) < Ylim(1)), Ylim(1) = p(i,2) - LimitRatio*AxisLength; axisLimChanged = true; end
if((p(i,3) - AxisLength) < Zlim(1)), Zlim(1) = p(i,3) - LimitRatio*AxisLength; axisLimChanged = true; end
if((p(i,1) + AxisLength) > Xlim(2)), Xlim(2) = p(i,1) + LimitRatio*AxisLength; axisLimChanged = true; end
if((p(i,2) + AxisLength) > Ylim(2)), Ylim(2) = p(i,2) + LimitRatio*AxisLength; axisLimChanged = true; end
if((p(i,3) + AxisLength) > Zlim(2)), Zlim(2) = p(i,3) + LimitRatio*AxisLength; axisLimChanged = true; end
if(axisLimChanged), set(gca, 'Xlim', Xlim, 'Ylim', Ylim, 'Zlim', Zlim); end
drawnow;
% Adjust view
if(numel(View) > 2)
view(View(i, :));
end
% Add frame to AVI object
if write_video
frame = getframe(fig);
writeVideo(writeObj, frame);
end
end
hold off;
% Close AVI file
if write_video
close(writeObj);
end
end
================================================
FILE: compVq.m
================================================
function Vq = compVq(q, a)
% COMPVQ  Jacobian of the rotated vector v = quat2dcm(q)' * a with
% respect to the scalar-first quaternion q = [q1 q2 q3 q4]. Returns 3x4.
q1 = q(1);
q2 = q(2);
q3 = q(3);
q4 = q(4);
ax = a(1);
ay = a(2);
az = a(3);
v_dot_q1 = [2*ax*q1 - 2*ay*q4 + 2*az*q3;...
2*ax*q4 + 2*ay*q1 - 2*az*q2;...
2*ay*q2 - 2*ax*q3 + 2*az*q1];
v_dot_q2 = [2*ax*q2 + 2*ay*q3 + 2*az*q4;...
2*ax*q3 - 2*ay*q2 - 2*az*q1;...
2*ax*q4 + 2*ay*q1 - 2*az*q2];
v_dot_q3 = [2*ay*q2 - 2*ax*q3 + 2*az*q1;...
2*ax*q2 + 2*ay*q3 + 2*az*q4;...
2*ay*q4 - 2*ax*q1 - 2*az*q3];
v_dot_q4 = [2*az*q2 - 2*ay*q1 - 2*ax*q4;...
2*ax*q1 - 2*ay*q4 + 2*az*q3;...
2*ax*q2 + 2*ay*q3 + 2*az*q4];
Vq = [v_dot_q1, v_dot_q2, v_dot_q3, v_dot_q4];
================================================
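compVq returns the 3x4 Jacobian of v = quat2dcm(q)' * a with respect to the scalar-first quaternion q. Since v is quadratic in q, a central finite difference reproduces this Jacobian essentially exactly, which makes a good sanity check for the hand-derived formulas. A self-contained Python transcription and check (helper names are mine, not from the repo):

```python
def comp_vq(q, a):
    # Analytic 3x4 Jacobian d(C(q)^T a)/dq, transcribed from compVq.m
    q1, q2, q3, q4 = q
    ax, ay, az = a
    return [
        [2*(ax*q1 - ay*q4 + az*q3), 2*(ax*q2 + ay*q3 + az*q4),
         2*(ay*q2 - ax*q3 + az*q1), 2*(az*q2 - ay*q1 - ax*q4)],
        [2*(ax*q4 + ay*q1 - az*q2), 2*(ax*q3 - ay*q2 - az*q1),
         2*(ax*q2 + ay*q3 + az*q4), 2*(ax*q1 - ay*q4 + az*q3)],
        [2*(ay*q2 - ax*q3 + az*q1), 2*(ax*q4 + ay*q1 - az*q2),
         2*(ay*q4 - ax*q1 - az*q3), 2*(ax*q2 + ay*q3 + az*q4)],
    ]

def rotate_body(q, a):
    # v = quat2dcm(q)' * a  (body-to-navigation rotation, scalar-first q)
    q1, q2, q3, q4 = q
    ax, ay, az = a
    return [
        (q1*q1 + q2*q2 - q3*q3 - q4*q4)*ax + 2*(q2*q3 - q1*q4)*ay + 2*(q2*q4 + q1*q3)*az,
        2*(q2*q3 + q1*q4)*ax + (q1*q1 - q2*q2 + q3*q3 - q4*q4)*ay + 2*(q3*q4 - q1*q2)*az,
        2*(q2*q4 - q1*q3)*ax + 2*(q3*q4 + q1*q2)*ay + (q1*q1 - q2*q2 - q3*q3 + q4*q4)*az,
    ]

def numeric_jacobian(q, a, eps=1e-6):
    # Central finite differences in each quaternion component
    J = [[0.0] * 4 for _ in range(3)]
    for j in range(4):
        qp = list(q); qp[j] += eps
        qm = list(q); qm[j] -= eps
        vp, vm = rotate_body(qp, a), rotate_body(qm, a)
        for i in range(3):
            J[i][j] = (vp[i] - vm[i]) / (2 * eps)
    return J
```

Running both on the same (q, a) and comparing elementwise confirms the analytic columns v_dot_q1 … v_dot_q4.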
FILE: data/readme.txt
================================================
Note that the StraightLine.mat, SpiralStairs.mat, StairsAndCorridor.mat were created from the csv files provided by:
https://github.com/xioTechnologies/Gait-Tracking-With-x-IMU
================================================
FILE: orien.m
================================================
% This is an Extended Kalman Filter which fuses
% accelerometer & gyroscope readings
% to get the orientation.
% Written by @TanTanTan
% Note that we assume the biases of IMU do not change in the whole
% process. This is approximately true when the sensors operate in a steady
% state.
clear; close all; clc;
%--- Dataset parameters
deltat = 1/200; % Sampling period
noise_gyro = 2.4e-3; % Gyroscope noise (discrete), rad/s
noise_accel = 2.83e-2; % Accelerometer noise, m/s^2
gravity = 9.81007; % Gravity magnitude, m/s^2
%--- Load data
% Each row of IMU data :
% (timestamp, wx, wy, wz, ax, ay, az)
% Each row of Ground truth data :
% (time, position, quaternion, velocity, gyroscope bias, accelerometer bias)
data = load('data/attitude_data.mat');
imu_data = data.imu_data; % IMU readings
grt_data = data.grt_data; % Ground truth (GT)
grt_q = grt_data(:, 5:8); % GT quaternions
bias_w = grt_data(1, 12:14); % gyroscope bias
bias_a = grt_data(1, 15:17); % accelerometer bias
%--- Container of the results
N = size(imu_data, 1); % number of samples
allX = zeros(N, 4);
%--- Initialization
x = grt_q(1,:)'; % Initial state (quaternion)
P = 1e-10 * eye(4); % Initial covariance
allX(1,:) = x';
% ---Here we go !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
for k = 2 : N
%--- 1. Propagation --------------------------
% Gyroscope measurements
w = (imu_data(k-1, 2:4) + imu_data(k, 2:4))/2;
w = w - bias_w;
% Compute the F matrix
Omega =[0 -w(1) -w(2) -w(3);...
w(1) 0 w(3) -w(2); ...
w(2) -w(3) 0 w(1); ...
w(3) w(2) -w(1) 0 ];
F = eye(4) + deltat * Omega / 2;
% Compute the process noise Q
G = [-x(2) -x(3) -x(4); ...
x(1) -x(4) x(3); ...
x(4) x(1) -x(2); ...
-x(3) x(2) x(1)] / 2;
Q = (noise_gyro * deltat)^2 * (G * G');
% Propagate the state and covariance
x = F * x;
x = x / norm(x); % Normalize the quaternion
P = F * P * F' + Q;
%--- 2. Update----------------------------------
% Accelerometer measurements
a = imu_data(k, 5:7);
a = a - bias_a;
% We use the unit vector of acceleration as observation
ea = a' / norm(a);
ea_prediction = [2*(x(2)*x(4)-x(1)*x(3)); ...
2*(x(3)*x(4)+x(1)*x(2)); ...
x(1)^2-x(2)^2-x(3)^2+x(4)^2];
% Residual
y = ea - ea_prediction;
% Compute the measurement matrix H
H = 2*[-x(3) x(4) -x(1) x(2); ...
x(2) x(1) x(4) x(3); ...
x(1) -x(2) -x(3) x(4)];
% Measurement noise R
R_internal = (noise_accel / norm(a))^2 * eye(3);
R_external = (1-gravity/norm(a))^2 * eye(3);
R = R_internal + R_external;
% Update
S = H * P * H' + R;
K = P * H' / S;
x = x + K * y;
P = (eye(4) - K * H) * P;
% 3. Ending
x = x / norm(x); % Normalize the quaternion
P = (P + P') / 2; % Make sure that covariance matrix is symmetric
allX(k,:) = x'; % Save the result
end
%--- Compare the results with ground truth
q_Ws0 = quatinv(grt_q(1,:));
for i=1:N
grt_q(i,:) = quatmultiply(q_Ws0, grt_q(i,:));
allX(i,:) = quatmultiply(q_Ws0, allX(i,:));
end
[psi, theta, phi] = quat2angle(allX);
[grt_psi, grt_theta, grt_phi] = quat2angle(grt_q);
figure, hold on
plot(1:N, psi, 'r-.', 1:N, grt_psi, 'r');
plot(1:N, theta, 'g-.', 1:N, grt_theta, 'g');
plot(1:N, phi, 'b-.', 1:N, grt_phi, 'b');
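Two details of the update step in orien.m are worth isolating: the predicted measurement is the unit gravity vector expressed in the sensor frame (the third column of quat2dcm(q)), and the measurement noise R is inflated by (1 - g/|a|)^2 so the filter trusts the accelerometer less whenever |a| departs from gravity, i.e. during dynamic motion. A minimal Python sketch of just these two pieces (function names are illustrative):

```python
import math

GRAVITY = 9.81007      # m/s^2, as in orien.m
NOISE_ACCEL = 2.83e-2  # accelerometer noise, m/s^2

def gravity_direction_body(q):
    # Predicted unit gravity vector in the sensor frame: third column of
    # quat2dcm(q), scalar-first quaternion convention
    q1, q2, q3, q4 = q
    return [2*(q2*q4 - q1*q3),
            2*(q3*q4 + q1*q2),
            q1*q1 - q2*q2 - q3*q3 + q4*q4]

def accel_measurement_noise(a):
    # Scalar variance for R = sigma^2 * I: intrinsic sensor noise plus an
    # inflation term that grows as |a| departs from g, down-weighting the
    # accelerometer update during dynamic motion
    norm_a = math.sqrt(sum(c * c for c in a))
    r_internal = (NOISE_ACCEL / norm_a) ** 2
    r_external = (1.0 - GRAVITY / norm_a) ** 2
    return r_internal + r_external
```

At rest (|a| = g) the inflation term vanishes and R reduces to the scaled sensor noise; under 2g of acceleration it dominates by several orders of magnitude.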
================================================
FILE: zupt.m
================================================
% This implements a zero velocity update EKF algorithm, which can be used
% for pedestrian tracking.
% By @TanTanTan
% Email: jinaqiaoqiao@gmail.com
clear; close all; clc;
% --- Dataset parameters
dt = 1/256;
g = 9.81;
sigma_acc = 0.1; % accelerometer noise (m/s^2)
sigma_gyro = 0.1*pi/180; % gyroscope noise (rad/s)
sigma_vel = 0.01; % zero-velocity update measurement noise (m/s)
%% zero-velocity detector parameters
cova = 0.01^2;
covw = (0.1*pi/180)^2;
W = 5; % window size
gamma = 0.3e5;
%% read data (Choose one)
% % data = load('data/StraightLine.mat');
% % data = load('data/StairsAndCorridor.mat');
data = load('data/SpiralStairs.mat');
imu_data = data.imu_data';
N = size(imu_data, 2);
%% Since we have all the data in advance, we can run the zero-velocity
%% detection over the whole dataset.
iszv = zeros(1, N);
T = zeros(1, N-W+1);
for k = 1:N-W+1
mean_a = mean(imu_data(1:3,k:k+W-1), 2);
for l = k:k+W-1
temp = imu_data(1:3,l) - g * mean_a / norm(mean_a);
T(k) = T(k) + imu_data(4:6,l)'*imu_data(4:6,l)/covw + temp'*temp/cova;
end
end
T = T./W;
for k = 1:size(T,2)
if T(k) < gamma
iszv(k:k+W-1) = ones(1,W);
end
end
%%
%=========================================================================%
%== Initialization =%
%=========================================================================%
% We require the IMU to be static for the first second, so we can
% estimate the initial roll/pitch angles as well as the sensor biases.
init_a = mean(imu_data(1:3,1:20),2);
init_a = init_a / norm(init_a);
init_psi = 0;
init_theta = -asin(init_a(1));
init_phi = atan2(init_a(2), init_a(3));
init_quat = angle2quat(init_psi, init_theta, init_phi);
% Estimate sensor bias.
Rsw = quat2dcm(init_quat);
as = Rsw * [0;0;g];
bias_a = mean(imu_data(1:3,1:500),2) - as;
bias_w = mean(imu_data(4:6,1:500),2);
% set the initial state vector
x = zeros(10,1);
x(7:10,1) = init_quat';
% set the initial covariance
P = diag([1e-10*ones(1,6), 1e-6*ones(1,4)]);
%
x_r = zeros(10,N);
x_r(:,1) = x;
% measurement matrix
H = [zeros(3), eye(3), zeros(3,4)];
R = sigma_vel^2 * eye(3);
%%
%=========================================================================%
%== Main Loop =%
%=========================================================================%
for k = 2:N
%% compute state transition matrix F and covariance Q
w = imu_data(4:6, k-1); % - bias_w;
quat = x(7:10,1);
a = imu_data(1:3, k-1); % - bias_a;
% continuous state transition matrix
Ow = [0 -w(1) -w(2) -w(3);...
w(1) 0 w(3) -w(2);...
w(2) -w(3) 0 w(1);...
w(3) w(2) -w(1) 0 ];
Vq = compVq(quat, a);
Fc = zeros(10);
Fc(1:3, 4:6) = eye(3);
Fc(4:10,7:10)= [Vq; 0.5*Ow];
% continuous process covariance
Gq = 0.5* [-quat(2) -quat(3) -quat(4); ...
quat(1) -quat(4) quat(3); ...
quat(4) quat(1) -quat(2); ...
-quat(3) quat(2) quat(1)];
Qc = zeros(10);
Qc(4:6, 4:6) = sigma_acc^2*eye(3);
Qc(7:10,7:10) = sigma_gyro^2*(Gq*Gq');
% discretization
F = eye(10) + Fc* dt;
Q = Qc* dt;
%% state propagation
R_S_n = quat2dcm(quat');
acc = R_S_n' * a - [0; 0; g];
x(1:3) = x(1:3) + x(4:6)* dt + 0.5*acc* dt^2;
x(4:6) = x(4:6) + acc* dt;
quat = (eye(4) + 0.5*Ow* dt)*quat;
quat = quat/norm(quat);
x(7:10) = quat;
%% covariance propagation
P = F*P*F' + Q;
%% zero-velocity update
if iszv(k) == 1
K = (P*H')/(H*P*H'+R);
y = -x(4:6);
x = x + K*y;
x(7:10) = x(7:10)/norm(x(7:10));
P = (eye(10)-K*H)*P;
end
P = (P+P')/2;
x_r(:,k) = x;
end
%% View the results
%% Plot the trajectory
figure(1), hold on
plot(x_r(1,:),x_r(2,:));
plot(x_r(1,1),x_r(2,1),'ro');
legend('Trajectory','Start point')
axis equal
grid on
figure(2),
plot(1:N,x_r(3,:));
title('Height')
grid on
%% Animation
%% The animation code was stolen from xioTechnologies.
%% https://github.com/xioTechnologies/Gait-Tracking-With-x-IMU
L = size(x_r,2);
SamplePlotFreq = 4;
Spin = 120;
SixDofAnimation(x_r(1:3,:)', quat2dcm(quatinv(x_r(7:10,:)')), ...
'SamplePlotFreq', SamplePlotFreq, 'Trail', 'All',...
'Position', [9 39 1280 768],...
'View', [(100:(Spin/(L-1)):(100+Spin))', 10*ones(L, 1)],...
'AxisLength', 0.1, 'ShowArrowHead', false, ...
'Xlabel', 'X (m)', 'Ylabel', 'Y (m)', 'Zlabel', 'Z (m)',...
'ShowLegend', false,...
'CreateVideo', false);
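The zero-velocity detector at the top of zupt.m computes, per sliding window, the gyro energy plus the accelerometer's deviation from g along the mean acceleration direction, and flags the window as stationary when the averaged statistic falls below gamma (cf. reference [3]). A direct Python transcription using plain lists (the function name is mine, not from the repo):

```python
import math

def zero_velocity_flags(acc, gyro, g=9.81, cova=0.01**2,
                        covw=(0.1 * math.pi / 180)**2, W=5, gamma=0.3e5):
    # acc, gyro: lists of 3-element samples (m/s^2, rad/s).
    # For each length-W window: sum gyro energy / covw plus the squared
    # deviation of acc from g along the window's mean direction / cova;
    # average over W and mark every sample in the window stationary if
    # the statistic is below gamma.
    N = len(acc)
    iszv = [0] * N
    for k in range(N - W + 1):
        mean_a = [sum(acc[l][i] for l in range(k, k + W)) / W for i in range(3)]
        na = math.sqrt(sum(c * c for c in mean_a))
        T = 0.0
        for l in range(k, k + W):
            tmp = [acc[l][i] - g * mean_a[i] / na for i in range(3)]
            T += sum(c * c for c in gyro[l]) / covw + sum(c * c for c in tmp) / cova
        T /= W
        if T < gamma:
            for l in range(k, k + W):
                iszv[l] = 1
    return iszv
```

With a perfectly static stream (acc = [0, 0, g], gyro = 0) every sample is flagged; any appreciable angular rate pushes the statistic far above gamma and suppresses the update.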