Merge branch 'release/v2.4.2' into develop

TinyO 2019-09-02 10:19:27 +08:00
commit a3f6c1fd9f
19 changed files with 504 additions and 27 deletions

View File

@ -38,7 +38,7 @@ PROJECT_NAME = "MYNT EYE S SDK"
# could be handy for archiving the generated documentation or if some version
# control system is used.
PROJECT_NUMBER = 2.4.1
PROJECT_NUMBER = 2.4.2
# Using the PROJECT_BRIEF tag one can provide an optional one line description
# for a project that appears at the top of each page and should give viewer a

View File

@ -3,7 +3,7 @@
Update Auxiliary Chip Firmware
==================================
Update auxiliary chip (Only Support S2100/S210A)
Update auxiliary chip (Only Support S21XX)
------------------------------------------------
* Plug in the MYNT® EYE camera into a USB3.0 port

View File

@ -1,7 +1,7 @@
.. _mynteye_spec_s2110:
S2110-146/Color Product Specification
S2110-95/Color Product Specification
========================================
Product Specification
@ -9,7 +9,7 @@ Product Specification
========================== =====================================================
Model S2110-146/Color
Model S2110-95/Color
-------------------------- -----------------------------------------------------
Size PCB dimension: 15x100mm
Total dimension: 125x47x26.6mm

View File

@ -13,7 +13,7 @@ For mynteye s1030, the settings available for adjustment during auto exposure ar
* ``Option::MAX_EXPOSURE_TIME`` Maximum exposure time.
* ``Option::DESIRED_BRIGHTNESS`` Expected brightness.
For mynteye s2100/s210a, the settings available for adjustment during auto exposure are:
For mynteye s21XX, the settings available for adjustment during auto exposure are:
* ``Option::MAX_GAIN`` Maximum gain.
* ``Option::MAX_EXPOSURE_TIME`` Maximum exposure time.
@ -45,7 +45,7 @@ s1030
LOG(INFO) << "Set DESIRED_BRIGHTNESS to "
<< api->GetOptionValue(Option::DESIRED_BRIGHTNESS);
s2100/s210a
s21XX
.. code-block:: c++
@ -97,7 +97,7 @@ s1030
I0513 14:07:58.521375 31845 auto_exposure.cc:41] Set DESIRED_BRIGHTNESS to 192
s2100/s210a
s21XX
.. code-block:: bash
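For reference, here is a minimal sketch of setting the S21XX auto-exposure options listed above. It only uses the ``API::Create``/``SetOptionValue``/``GetOptionValue`` calls that appear throughout these samples; the values are placeholders, not recommended settings, and the option list shown in this excerpt may not be complete.

.. code-block:: c++

#include "mynteye/api/api.h"
#include "mynteye/logger.h"

MYNTEYE_USE_NAMESPACE

int main(int argc, char *argv[]) {
  auto &&api = API::Create(argc, argv);
  if (!api) return 1;

  // Placeholder values; the valid ranges depend on the device.
  api->SetOptionValue(Option::MAX_GAIN, 8);
  api->SetOptionValue(Option::MAX_EXPOSURE_TIME, 333);

  LOG(INFO) << "Set MAX_GAIN to " << api->GetOptionValue(Option::MAX_GAIN);
  LOG(INFO) << "Set MAX_EXPOSURE_TIME to "
            << api->GetOptionValue(Option::MAX_EXPOSURE_TIME);
  return 0;
}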

View File

@ -12,7 +12,7 @@ For mynteye s1030, to set the image frame rate and IMU frequency, set ``Option::
* The effective fps of the image: 10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60.
* The effective frequency of IMU: 100, 200, 250, 333, 500.
For mynteye s2100/s210a, the image frame rate should be selected when running the sample, and the frame rate and resolution are combined as follows:
For mynteye s21XX, the image frame rate should be selected when running the sample, and the frame rate and resolution are combined as follows:
.. code-block:: bash
@ -45,7 +45,7 @@ s1030
LOG(INFO) << "Set IMU_FREQUENCY to "
<< api->GetOptionValue(Option::IMU_FREQUENCY);
s2100/s210a
s21XX
.. code-block:: c++
@ -78,7 +78,7 @@ s1030
I0513 14:06:21.702388 31813 framerate.cc:85] Img count: 573, fps: 24.6122
I0513 14:06:21.702404 31813 framerate.cc:87] Imu count: 11509, hz: 494.348
s2100/s210a
s21XX
.. code-block:: bash
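As a rough illustration of the S21XX behaviour described above, the frame rate and resolution are chosen together as a stream request at start-up rather than through an option. The sketch below reuses the device-layer helpers that appear in ``record.cc`` later in this commit; it is an assumption that the same flow applies to this sample.

.. code-block:: c++

#include "mynteye/logger.h"
#include "mynteye/device/device.h"
#include "mynteye/device/utils.h"

MYNTEYE_USE_NAMESPACE

int main(int argc, char *argv[]) {
  glog_init _(argc, argv);
  auto &&device = device::select();
  if (!device) return 1;
  bool ok;
  // On S21XX the frame rate/resolution combination is picked here,
  // from the list of supported stream requests.
  auto &&request = device::select_request(device, &ok);
  if (!ok) return 1;
  device->ConfigStreamRequest(request);
  return 0;
}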

View File

@ -9,11 +9,11 @@ To set the IIC address, set ``Option::IIC_ADDRESS_SETTING``.
.. Attention::
Only support S210A/2100
Only support S21XX
Reference Code:
s210a/s2100
s21XX
.. code-block:: c++
@ -33,7 +33,7 @@ s210a/s2100
Reference running results on Linux:
s210a/s2100
s21XX
.. code-block:: bash
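A minimal hedged sketch of setting the IIC address on an S21XX device, using the generic ``SetOptionValue``/``GetOptionValue`` calls shown elsewhere in these samples; ``0x31`` is only a placeholder address.

.. code-block:: c++

#include "mynteye/api/api.h"
#include "mynteye/logger.h"

MYNTEYE_USE_NAMESPACE

int main(int argc, char *argv[]) {
  auto &&api = API::Create(argc, argv);
  if (!api) return 1;

  // Only supported on S21XX; 0x31 is a placeholder address.
  api->SetOptionValue(Option::IIC_ADDRESS_SETTING, 0x31);
  LOG(INFO) << "Set IIC_ADDRESS_SETTING to "
            << api->GetOptionValue(Option::IIC_ADDRESS_SETTING);
  return 0;
}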

View File

@ -13,7 +13,7 @@ To set the range of accelerometer and gyroscope, set ``Option::ACCELEROMETER_RAN
* The effective range of the accelerometer (unit: g): 4, 8, 16, 32.
* The effective range of the gyroscope (unit: deg/s): 500, 1000, 2000, 4000.
For mynteye s2100/s210a, the available settings are:
For mynteye s21XX, the available settings are:
* The effective range of the accelerometer (unit: g): 6, 12, 24, 48.
* The effective range of the gyroscope (unit: deg/s): 250, 500, 1000, 2000, 4000.
@ -38,7 +38,7 @@ s1030
LOG(INFO) << "Set GYROSCOPE_RANGE to "
<< api->GetOptionValue(Option::GYROSCOPE_RANGE);
s2100/s210a
s21XX
.. code-block:: c++
@ -78,7 +78,7 @@ s1030
I/imu_range.cc:84 Img count: 363, fps: 25.0967
I/imu_range.cc:86 Imu count: 2825, hz: 195.312
s2100/s210a
s21XX
.. code-block:: bash
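For reference, a minimal sketch of selecting an accelerometer/gyroscope range on an S21XX device with the option names above; the chosen values come from the valid ranges listed in this section and are otherwise arbitrary.

.. code-block:: c++

#include "mynteye/api/api.h"
#include "mynteye/logger.h"

MYNTEYE_USE_NAMESPACE

int main(int argc, char *argv[]) {
  auto &&api = API::Create(argc, argv);
  if (!api) return 1;

  // S21XX: accelerometer range in {6, 12, 24, 48} g,
  //        gyroscope range in {250, 500, 1000, 2000, 4000} deg/s.
  api->SetOptionValue(Option::ACCELEROMETER_RANGE, 6);
  api->SetOptionValue(Option::GYROSCOPE_RANGE, 1000);

  LOG(INFO) << "Set ACCELEROMETER_RANGE to "
            << api->GetOptionValue(Option::ACCELEROMETER_RANGE);
  LOG(INFO) << "Set GYROSCOPE_RANGE to "
            << api->GetOptionValue(Option::GYROSCOPE_RANGE);
  return 0;
}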

View File

@ -8,7 +8,7 @@ Using the ``SetOptionValue()`` function of the API, you can set various control
IR is enabled by setting ``Option::IR_CONTROL`` to a value greater than 0; the greater the value, the greater the IR intensity.
.. Attention::
* mynteye s2100/s210a doesn't support this feature.
* mynteye s21XX doesn't support this feature.
Reference Code:
@ -49,6 +49,6 @@ At this point, if the image is displayed, you can see IR speckle on the image, a
.. attention::
The hardware will not record the IR value after being turned off. In order to keep IR enabled, you must set the IR value after turning on the device.
The hardware will not record the IR value after being turned off and will reset to 0. In order to keep IR enabled, you must set the IR value after turning on the device.
For complete code samples, see `ctrl_infrared.cc <https://github.com/slightech/MYNT-EYE-S-SDK/blob/master/samples/ctrl_infrared.cc>`_ .
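In addition to that sample, the following is a small sketch of turning IR on for an S1030 (S21XX devices do not support ``Option::IR_CONTROL``). The value 80 is only a placeholder, and as noted above the IR value must be set again each time the device is turned on.

.. code-block:: c++

#include "mynteye/api/api.h"
#include "mynteye/logger.h"

MYNTEYE_USE_NAMESPACE

int main(int argc, char *argv[]) {
  auto &&api = API::Create(argc, argv);
  if (!api) return 1;

  // S1030 only; larger values mean stronger IR speckle. 80 is a placeholder.
  api->SetOptionValue(Option::IR_CONTROL, 80);
  LOG(INFO) << "Set IR_CONTROL to " << api->GetOptionValue(Option::IR_CONTROL);
  return 0;
}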

View File

@ -13,7 +13,7 @@ For mynteye s1030, during manual exposure, the settings available for adjustment
* ``Option::BRIGHTNESS`` Brightness (Exposure time).
* ``Option::CONTRAST`` Contrast (Black level calibration).
For mynteye s2100/s210a, during manual exposure, the settings available for adjustment are:
For mynteye s21XX, during manual exposure, the settings available for adjustment are:
* ``Option::BRIGHTNESS`` Brightness (Exposure time).
@ -41,7 +41,7 @@ s1030
LOG(INFO) << "Set BRIGHTNESS to " << api->GetOptionValue(Option::BRIGHTNESS);
LOG(INFO) << "Set CONTRAST to " << api->GetOptionValue(Option::CONTRAST);
s2100/s210a
s21XX
.. code-block:: c++
@ -81,7 +81,7 @@ s1030
I0513 14:09:17.552958 31908 manual_exposure.cc:39] Set BRIGHTNESS to 120
I0513 14:09:17.552963 31908 manual_exposure.cc:40] Set CONTRAST to 116
s2100/s210a
s21XX
.. code-block:: bash
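A hedged sketch of S1030 manual exposure using the options listed above. Switching into manual exposure is assumed here to go through ``Option::EXPOSURE_MODE`` (value 1), which this excerpt does not show; the brightness and contrast values are placeholders taken from the log output above.

.. code-block:: c++

#include "mynteye/api/api.h"
#include "mynteye/logger.h"

MYNTEYE_USE_NAMESPACE

int main(int argc, char *argv[]) {
  auto &&api = API::Create(argc, argv);
  if (!api) return 1;

  // Assumption: 1 selects manual exposure in this SDK's option scheme.
  api->SetOptionValue(Option::EXPOSURE_MODE, 1);
  api->SetOptionValue(Option::BRIGHTNESS, 120);  // exposure time, placeholder
  api->SetOptionValue(Option::CONTRAST, 116);    // black level calibration, placeholder

  LOG(INFO) << "Set BRIGHTNESS to " << api->GetOptionValue(Option::BRIGHTNESS);
  LOG(INFO) << "Set CONTRAST to " << api->GetOptionValue(Option::CONTRAST);
  return 0;
}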

View File

@ -12,8 +12,12 @@ SDK Data Samples
get_stereo_rectified
get_disparity
get_depth
get_points
get_imu
get_imu_correspondence
get_from_callbacks
get_with_plugin
save_params
save_single_image
write_img_params
write_imu_params

View File

@ -7,7 +7,7 @@ Use ``GetIntrinsics()`` & ``GetExtrinsics()`` to get image calibration parameter
.. tip::
For the detailed meaning of the parameters, refer to the files in ``tools/writer/config`` . Of these,
the image calibration parameters of S2100/S210A are in ``tools/writer/config/S210A``
the image calibration parameters of S21XX are in ``tools/writer/config/S21XX``
the image calibration parameters of S1030 are in ``tools/writer/config/S1030``
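As a short illustration of reading the calibration back through the API, a hedged sketch follows. The exact overloads are not shown in this excerpt and differ between SDK versions; ``GetIntrinsicsBase`` is used here on the assumption that it is the non-template counterpart of ``GetIntrinsics()``, and the returned types are assumed to be printable.

.. code-block:: c++

#include "mynteye/api/api.h"
#include "mynteye/logger.h"

MYNTEYE_USE_NAMESPACE

int main(int argc, char *argv[]) {
  auto &&api = API::Create(argc, argv);
  if (!api) return 1;

  // Assumed signatures: check include/mynteye/api/api.h for your SDK version.
  auto &&in_left = api->GetIntrinsicsBase(Stream::LEFT);
  auto &&ex_left_to_right = api->GetExtrinsics(Stream::LEFT, Stream::RIGHT);
  LOG(INFO) << "Left intrinsics: {" << *in_left << "}";
  LOG(INFO) << "Left-to-right extrinsics: {" << ex_left_to_right << "}";
  return 0;
}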
Note

View File

@ -0,0 +1,56 @@
.. _data_get_points:
Get Point Image
================
Point images belong to the upper layer of synthetic data. To get this kind of data through ``GetStreamData()``, you need to enable it with ``EnableStreamData()`` beforehand, and the data should be checked to be non-empty before use.
For a detailed description of the process, please see :ref:`get_stereo` and :ref:`get_stereo_rectified` .
Sample code snippet:
.. code-block:: c++
auto &&api = API::Create(argc, argv);
api->EnableStreamData(Stream::POINTS);
api->Start(Source::VIDEO_STREAMING);
cv::namedWindow("frame");
PCViewer pcviewer;
while (true) {
api->WaitForStreams();
auto &&left_data = api->GetStreamData(Stream::LEFT);
auto &&right_data = api->GetStreamData(Stream::RIGHT);
cv::Mat img;
cv::hconcat(left_data.frame, right_data.frame, img);
cv::imshow("frame", img);
auto &&points_data = api->GetStreamData(Stream::POINTS);
if (!points_data.frame.empty()) {
pcviewer.Update(points_data.frame);
}
char key = static_cast<char>(cv::waitKey(1));
if (key == 27 || key == 'q' || key == 'Q') { // ESC/Q
break;
}
if (pcviewer.WasStopped()) {
break;
}
}
api->Stop(Source::VIDEO_STREAMING);
`PCL <https://github.com/PointCloudLibrary/pcl>`_ is used to display the point images above. The program will close when the point image window is closed.
For complete code examples, see `get_depth_and_points.cc <https://github.com/slightech/MYNT-EYE-S-SDK/blob/master/samples/get_depth_and_points.cc>`_ .
.. attention::
The sample code is compiled only when `PCL <https://github.com/PointCloudLibrary/pcl>`_ is available. If PCL is installed in a different directory, please set ``CMAKE_PREFIX_PATH`` in `tutorials/CMakeLists.txt <https://github.com/slightech/MYNT-EYE-S-SDK/blob/master/samples/tutorials/CMakeLists.txt>`_ to the path of ``PCLConfig.cmake`` . You can find ``CMAKE_PREFIX_PATH`` near ``find_package(PCL)`` .

View File

@ -24,7 +24,7 @@ And, `samples/config/S1030/img.params.pinhole <https://github.com/slightech/MYNT
.. tip::
The image calibration parameters of S2100/S210A are in ``samples/config/S210A``
The image calibration parameters of S21XX are in ``samples/config/S21XX``
The image calibration parameters of S1030 are in ``samples/config/S1030``
.. tip::

View File

@ -16,7 +16,7 @@ Reference commands:
# Windows
.\samples\_output\bin\write_imu_params.bat samples\config\imu.params
The path of parameters file can be found in `samples/config/img.params <https://github.com/slightech/MYNT-EYE-S-SDK/blob/master/samples/config/img.params>`_ . If you calibrated the parameters yourself, you can edit the file and run above commands to write them into the device.
The path of the parameters folder can be found in `samples/config/imu.params <https://github.com/slightech/MYNT-EYE-S-SDK/blob/master/samples/config>`_ . If you calibrated the parameters yourself, you can edit the file and run the above commands to write them into the device.
.. warning::

View File

@ -54,7 +54,7 @@ The ROS file is structured like follows:
│ ├─config/
│ │ ├─device/
│ │ ├─standard.yaml # S1030
│ │ └─standard2.yaml # S2100/S210A
│ │ └─standard2.yaml # S21XX
│ │ ├─laserscan/
│ │ ├─process/
│ │ └─...
@ -72,14 +72,14 @@ The ROS file is structured like follows:
│ └─package.xml
└─README.md
In ``mynteye.launch``, you can configure the topics and frame_ids, decide which data to enable, ``standard.yaml`` (standard2.yaml is S2100/S210A config file) can set parameters for device. Please set ``gravity`` to the local gravity acceleration.
In ``mynteye.launch``, you can configure the topics and frame_ids and decide which data to enable; ``standard.yaml`` (``standard2.yaml`` is the S21XX config file) sets parameters for the device. Please set ``gravity`` to the local gravity acceleration.
standard.yaml/standard2.yaml:
.. code-block:: xml
# s2100/s210a modify frame/resolution
# s21XX modify frame/resolution
standard2/request_index: 2
# s1030 modify frame/imu hz

View File

@ -279,3 +279,11 @@ make_executable(save_all_infos
DLL_SEARCH_PATHS ${PRO_DIR}/_install/bin ${OpenCV_LIB_SEARCH_PATH}
)
## record
make_executable(record
SRCS record.cc dataset.cc
LINK_LIBS mynteye ${OpenCV_LIBS}
DLL_SEARCH_PATHS ${PRO_DIR}/_install/bin ${OpenCV_LIB_SEARCH_PATH}
)

samples/dataset.cc (new file, 203 lines)
View File

@ -0,0 +1,203 @@
// Copyright 2018 Slightech Co., Ltd. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include "dataset.h"
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/imgproc/imgproc.hpp>
#include <iomanip>
#include <limits>
#include <stdexcept>
#include <utility>
#include "mynteye/logger.h"
#include "mynteye/util/files.h"
#define FULL_PRECISION \
std::fixed << std::setprecision(std::numeric_limits<double>::max_digits10)
#define IMAGE_FILENAME_WIDTH 6
MYNTEYE_BEGIN_NAMESPACE
namespace tools {
Dataset::Dataset(std::string outdir) : outdir_(std::move(outdir)) {
VLOG(2) << __func__;
if (!files::mkdir(outdir_)) {
LOG(FATAL) << "Create directory failed: " << outdir_;
}
}
Dataset::~Dataset() {
VLOG(2) << __func__;
for (auto &&it = stream_writers_.begin(); it != stream_writers_.end(); it++) {
if (it->second) {
it->second->ofs.flush();
it->second->ofs.close();
}
}
if (motion_writer_) {
motion_writer_->ofs.flush();
motion_writer_->ofs.close();
}
}
void Dataset::SaveStreamData(
const Stream &stream, const device::StreamData &data) {
auto &&writer = GetStreamWriter(stream);
auto seq = stream_counts_[stream];
writer->ofs << seq << ", " << data.img->frame_id << ", "
<< data.img->timestamp << ", " << data.img->exposure_time
<< std::endl;
if (data.frame) {
std::stringstream ss;
ss << writer->outdir << MYNTEYE_OS_SEP << std::dec
<< std::setw(IMAGE_FILENAME_WIDTH) << std::setfill('0') << seq << ".png";
if (data.frame->format() == Format::GREY) {
cv::Mat img(
data.frame->height(), data.frame->width(), CV_8UC1,
data.frame->data());
cv::imwrite(ss.str(), img);
} else if (data.frame->format() == Format::YUYV) {
cv::Mat img(
data.frame->height(), data.frame->width(), CV_8UC2,
data.frame->data());
cv::cvtColor(img, img, cv::COLOR_YUV2BGR_YUY2);
cv::imwrite(ss.str(), img);
} else if (data.frame->format() == Format::BGR888) {
cv::Mat img(
data.frame->height(), data.frame->width(), CV_8UC3,
data.frame->data());
cv::imwrite(ss.str(), img);
} else {
cv::Mat img(
data.frame->height(), data.frame->width(), CV_8UC1,
data.frame->data());
cv::imwrite(ss.str(), img);
}
}
++stream_counts_[stream];
}
void Dataset::SaveMotionData(const device::MotionData &data) {
auto &&writer = GetMotionWriter();
// auto seq = data.imu->serial_number;
auto seq = motion_count_;
writer->ofs << seq << ", " << static_cast<int>(data.imu->flag) << ", "
<< data.imu->timestamp << ", " << data.imu->accel[0] << ", "
<< data.imu->accel[1] << ", " << data.imu->accel[2] << ", "
<< data.imu->gyro[0] << ", " << data.imu->gyro[1] << ", "
<< data.imu->gyro[2] << ", " << data.imu->temperature
<< std::endl;
++motion_count_;
/*
if(motion_count_ != seq) {
LOG(INFO) << "motion_count_ != seq !" << " motion_count_: " << motion_count_
<< " seq: " << seq;
motion_count_ = seq;
}
*/
}
void Dataset::SaveStreamData(
const Stream &stream, const api::StreamData &data) {
auto &&writer = GetStreamWriter(stream);
auto seq = stream_counts_[stream];
writer->ofs << seq << ", " << data.img->frame_id << ", "
<< data.img->timestamp << ", " << data.img->exposure_time
<< std::endl;
if (!data.frame.empty()) {
std::stringstream ss;
ss << writer->outdir << MYNTEYE_OS_SEP << std::dec
<< std::setw(IMAGE_FILENAME_WIDTH) << std::setfill('0') << seq << ".png";
cv::imwrite(ss.str(), data.frame);
}
++stream_counts_[stream];
}
void Dataset::SaveMotionData(const api::MotionData &data) {
auto &&writer = GetMotionWriter();
// auto seq = data.imu->serial_number;
auto seq = motion_count_;
if (data.imu->flag == 1 || data.imu->flag == 2) {
writer->ofs << seq << ", " << static_cast<int>(data.imu->flag) << ", "
<< data.imu->timestamp << ", " << data.imu->accel[0] << ", "
<< data.imu->accel[1] << ", " << data.imu->accel[2] << ", "
<< data.imu->gyro[0] << ", " << data.imu->gyro[1] << ", "
<< data.imu->gyro[2] << ", " << data.imu->temperature
<< std::endl;
++motion_count_;
}
/*
if(motion_count_ != seq) {
LOG(INFO) << "motion_count_ != seq !" << " motion_count_: " << motion_count_
<< " seq: " << seq;
motion_count_ = seq;
}
*/
}
Dataset::writer_t Dataset::GetStreamWriter(const Stream &stream) {
try {
return stream_writers_.at(stream);
} catch (const std::out_of_range &e) {
writer_t writer = std::make_shared<Writer>();
switch (stream) {
case Stream::LEFT: {
writer->outdir = outdir_ + MYNTEYE_OS_SEP "left";
} break;
case Stream::RIGHT: {
writer->outdir = outdir_ + MYNTEYE_OS_SEP "right";
} break;
default:
LOG(FATAL) << "Unsupported stream: " << stream;
}
writer->outfile = writer->outdir + MYNTEYE_OS_SEP "stream.txt";
files::mkdir(writer->outdir);
writer->ofs.open(writer->outfile, std::ofstream::out);
writer->ofs << "seq, frame_id, timestamp, exposure_time" << std::endl;
writer->ofs << FULL_PRECISION;
stream_writers_[stream] = writer;
stream_counts_[stream] = 0;
return writer;
}
}
Dataset::writer_t Dataset::GetMotionWriter() {
if (motion_writer_ == nullptr) {
writer_t writer = std::make_shared<Writer>();
writer->outdir = outdir_;
writer->outfile = writer->outdir + MYNTEYE_OS_SEP "motion.txt";
files::mkdir(writer->outdir);
writer->ofs.open(writer->outfile, std::ofstream::out);
writer->ofs << "seq, flag, timestamp, accel_x, accel_y, accel_z, "
"gyro_x, gyro_y, gyro_z, temperature"
<< std::endl;
writer->ofs << FULL_PRECISION;
motion_writer_ = writer;
motion_count_ = 0;
accel_count_ = 0;
gyro_count_ = 0;
}
return motion_writer_;
}
} // namespace tools
MYNTEYE_END_NAMESPACE

samples/dataset.h (new file, 69 lines)
View File

@ -0,0 +1,69 @@
// Copyright 2018 Slightech Co., Ltd. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#ifndef MYNTEYE_TOOLS_DATASET_H_ // NOLINT
#define MYNTEYE_TOOLS_DATASET_H_
#pragma once
#include <fstream>
#include <map>
#include <memory>
#include <string>
#include "mynteye/mynteye.h"
#include "mynteye/api/api.h"
#include "mynteye/device/callbacks.h"
MYNTEYE_BEGIN_NAMESPACE
namespace tools {
class Dataset {
public:
struct Writer {
std::ofstream ofs;
std::string outdir;
std::string outfile;
};
using writer_t = std::shared_ptr<Writer>;
explicit Dataset(std::string outdir);
~Dataset();
void SaveStreamData(const Stream &stream, const device::StreamData &data);
void SaveMotionData(const device::MotionData &data);
void SaveStreamData(const Stream &stream, const api::StreamData &data);
void SaveMotionData(const api::MotionData &data);
private:
writer_t GetStreamWriter(const Stream &stream);
writer_t GetMotionWriter();
std::string outdir_;
std::map<Stream, writer_t> stream_writers_;
writer_t motion_writer_;
std::map<Stream, std::size_t> stream_counts_;
std::size_t motion_count_;
std::size_t accel_count_;
std::size_t gyro_count_;
};
} // namespace tools
MYNTEYE_END_NAMESPACE
#endif // MYNTEYE_TOOLS_DATASET_H_ NOLINT

samples/record.cc (new file, 137 lines)
View File

@ -0,0 +1,137 @@
// Copyright 2018 Slightech Co., Ltd. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/imgproc/imgproc.hpp>
#include "mynteye/logger.h"
#include "mynteye/device/device.h"
#include "mynteye/device/utils.h"
#include "mynteye/util/times.h"
#include "dataset.h"
MYNTEYE_USE_NAMESPACE
int main(int argc, char *argv[]) {
glog_init _(argc, argv);
auto &&device = device::select();
if (!device) return 1;
bool ok;
auto &&request = device::select_request(device, &ok);
if (!ok) return 1;
device->ConfigStreamRequest(request);
device->LogOptionInfos();
// Enabling this will cache the motion data until you read it.
device->EnableMotionDatas();
device->Start(Source::ALL);
const char *outdir;
if (argc >= 2) {
outdir = argv[1];
} else {
outdir = "./dataset";
}
tools::Dataset dataset(outdir);
cv::namedWindow("frame");
std::size_t img_count = 0;
std::size_t imu_count = 0;
auto &&time_beg = times::now();
while (true) {
device->WaitForStreams();
auto &&left_datas = device->GetStreamDatas(Stream::LEFT);
auto &&right_datas = device->GetStreamDatas(Stream::RIGHT);
img_count += left_datas.size();
auto &&motion_datas = device->GetMotionDatas();
imu_count += motion_datas.size();
auto &&left_frame = left_datas.back().frame;
auto &&right_frame = right_datas.back().frame;
cv::Mat img;
if (left_frame->format() == Format::GREY) {
cv::Mat left_img(
left_frame->height(), left_frame->width(), CV_8UC1,
left_frame->data());
cv::Mat right_img(
right_frame->height(), right_frame->width(), CV_8UC1,
right_frame->data());
cv::hconcat(left_img, right_img, img);
} else if (left_frame->format() == Format::YUYV) {
cv::Mat left_img(
left_frame->height(), left_frame->width(), CV_8UC2,
left_frame->data());
cv::Mat right_img(
right_frame->height(), right_frame->width(), CV_8UC2,
right_frame->data());
cv::cvtColor(left_img, left_img, cv::COLOR_YUV2BGR_YUY2);
cv::cvtColor(right_img, right_img, cv::COLOR_YUV2BGR_YUY2);
cv::hconcat(left_img, right_img, img);
} else if (left_frame->format() == Format::BGR888) {
cv::Mat left_img(
left_frame->height(), left_frame->width(), CV_8UC3,
left_frame->data());
cv::Mat right_img(
right_frame->height(), right_frame->width(), CV_8UC3,
right_frame->data());
cv::hconcat(left_img, right_img, img);
} else {
return -1;
}
cv::imshow("frame", img);
if (img_count > 10 && imu_count > 50) { // save
for (auto &&left : left_datas) {
dataset.SaveStreamData(Stream::LEFT, left);
}
for (auto &&right : right_datas) {
dataset.SaveStreamData(Stream::RIGHT, right);
}
for (auto &&motion : motion_datas) {
dataset.SaveMotionData(motion);
}
std::cout << "\rSaved " << img_count << " imgs"
<< ", " << imu_count << " imus" << std::flush;
}
char key = static_cast<char>(cv::waitKey(1));
if (key == 27 || key == 'q' || key == 'Q') { // ESC/Q
break;
}
}
std::cout << " to " << outdir << std::endl;
auto &&time_end = times::now();
device->Stop(Source::ALL);
float elapsed_ms =
times::count<times::microseconds>(time_end - time_beg) * 0.001f;
LOG(INFO) << "Time beg: " << times::to_local_string(time_beg)
<< ", end: " << times::to_local_string(time_end)
<< ", cost: " << elapsed_ms << "ms";
LOG(INFO) << "Img count: " << img_count
<< ", fps: " << (1000.f * img_count / elapsed_ms);
LOG(INFO) << "Imu count: " << imu_count
<< ", hz: " << (1000.f * imu_count / elapsed_ms);
return 0;
}