1. Download the APP from the link above
2. Connect the Intel RealSense camera to the NUC computer via a USB cable
3. The provided NUC computer and the phone must be connected to the same network (the phone's hotspot is the best choice)
The voice-guided obstacle-avoidance system relies on the Intel RealSense camera, and this is the only button on the UI main page. When users need help from the voice guidance system, they just press the button. (Video Explanation)
After pressing the main page button, the UI changes to this page. The phone vibrates to alert visually impaired users, who then only need to say an instruction (see the instruction list); the system will respond according to the instruction spoken. (Video Explanation)
A brief video explanation of how to use the APP
After pressing the home button, users can say the following instructions (a conceptual parsing sketch follows the list):
1. Start:
Start the voice guidance obstacle-avoidance system
2. Stop:
Stop the voice guidance obstacle-avoidance system
3. Volume (0-1):
Modify the volume of the voice guidance
4. Pitch (0.5-2):
Modify the pitch of the voice guidance
5. Speed (0-1):
Modify the speed of the voice guidance
6. Vibration (on/off):
Turn the APP's vibration function on or off
7. Location:
When users say "Location", the system will automatically search the location database to find the nearest places with Intel cameras, extending the system's coverage.
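Conceptually, each spoken phrase maps to a command name plus an optional value. The TypeScript sketch below illustrates that mapping for the instructions listed above; it is a hypothetical illustration only, not the APP's actual recognition code, and the function and type names are our own.

// Hypothetical sketch: maps a recognized utterance to a command.
// The command names mirror the instruction list above; the types,
// function name, and ranges in the comments are illustrative only.
interface Command {
  kind: 'start' | 'stop' | 'volume' | 'pitch' | 'speed' | 'vibration' | 'location';
  value?: number;    // volume/speed expect 0-1, pitch expects 0.5-2
  enabled?: boolean; // vibration on/off
}

function parseInstruction(utterance: string): Command | null {
  const [word, arg] = utterance.trim().toLowerCase().split(/\s+/);
  switch (word) {
    case 'start':
    case 'stop':
    case 'location':
      return { kind: word };
    case 'vibration':
      return { kind: 'vibration', enabled: arg === 'on' };
    case 'volume':
    case 'pitch':
    case 'speed': {
      const value = Number(arg);
      return Number.isFinite(value) ? { kind: word, value } : null;
    }
    default:
      return null; // not a recognized instruction
  }
}

console.log(parseInstruction('Volume 0.6')); // { kind: 'volume', value: 0.6 }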
Our code can be found on GitHub at this link: https://github.com/SightPlusPlus. As this project is quite large, we had to divide the code into multiple repositories.
SightPlusPlus-Web_portal
The repository named SightPlusPlus-Web_portal is a presentation website for developers. To see the website, just download the code and open index.html.
The project itself lives in the other three repositories. To run it, you need an Intel RealSense camera and an Intel NUC computer
(but the NUC computer is not mandatory; you can also run the code on your personal computer).
SightPlusPlus-Server
First, let's start the C++ server, SightPlusPlus-Server (which had already been written by other postgraduate students before we took over this project).
*** NOTE 1: These explanations are taken from the README written by the old team.
*** NOTE 2: This system is meant to run on NUC computers under Windows. Because of this, you may have trouble running it on Apple computers;
we know because we had trouble running it on our own Apple laptops.
Installation and Development
This project uses Vcpkg for library dependencies. Install the following libraries with Vcpkg to run the solution (make sure to install the 64-bit versions, as this system is being developed with 64-bit systems in mind):
Installation and Run Manual for CMake
This section is an instruction guide on how to build and run the system with CMake. It includes a guide for building in Windows PowerShell as well as within Microsoft Visual Studio 2019. Running the system with CMake has the same requirements as the original version. This guide includes the use of vcpkg, and it is assumed that CMake is already installed on your system.
Open with VS
1. Start Visual Studio and select "Open local folder".
2. Navigate into /Intel_SighPP_CMake and click "Select Folder".
3. Right-click on the top-level CMakeLists.txt and select CMake Settings. This will open a new JSON page to instruct VS on how to build the system.
4. Next, add the path to your vcpkg.cmake file to "CMake toolchain file". This is roughly "/scripts/buildsystems/vcpkg.cmake".
5. You will now be able to compile the system, provided you have installed everything correctly.
6. Right-click on the top-level CMakeLists.txt and click "Generate Cache". This builds the base build instructions for the CMake system.
7. Finally, to build the executable, select "Rebuild All" under the Build tab.
8. The executable files will now be within the ./build directory.
Running the software within VS
Now that you have built the project with CMake, you can run the system in debug mode from Visual Studio.
NOTE: You may need to copy the models folder to the location of the executable, especially if you changed the build directory.
There are a few options.
Option 1, target view:
In this method, first navigate to the Solution Explorer on the right and use the home icon to switch to the target view.
Option 2:
To set the run arguments, right-click on the top-level CMakeLists.txt, select "Add build configuration", and choose "Default".
Flags
Flags for running the system include:
Real Life approach
Connect the RealSense camera(s) and add the “realsense -outdoors” flag (for moving outdoors) or the “realsense -indoors” flag (for moving indoors).
SightPlusPlus-Client
After installing and successfully running the SightPlusPlus-Server, it’s time for the client side.
Prerequisites
1. Install Node.js. Download the installation package from the official Node.js website and choose the LTS version. We used Node.js version 11.0, but any version should work.
2. (For Windows) Add the installation directory into the environment variables of your OS.
Run it
After you download the project package and extract it, open CMD in the ‘client’ folder and run the commands below:
npm install
IMPORTANT! Please run the SightPlusPlus-Server before running SightPlusPlus-Client. A Connect Error will appear otherwise.
Now navigate to the client/src/ folder.
If you need to use the remote functions of the system, run in the terminal:
node start_remote.js
If you want to use the static functionality, run in the terminal (make sure you replace the latitude and longitude with the desired coordinates):
node start_static.js <latitude, longitude>
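Both entry points connect the client to the SightPlusPlus-Server. Since the dependency list further down includes the Node.js websocket package, a connection could conceptually look like the TypeScript sketch below. The server address, port, and message payload are placeholder assumptions for illustration, not the project's actual protocol.

// Illustrative sketch only: connect to the server over WebSocket.
// The host, port, and message payload are placeholder assumptions.
import { client as WebSocketClient } from 'websocket';

const SERVER_URL = 'ws://192.168.0.10:7979/'; // placeholder: your server's address and port

const wsClient = new WebSocketClient();

wsClient.on('connectFailed', (error) => {
  // This is the Connect Error you will see if the server is not running yet.
  console.error('Connect Error:', error.toString());
});

wsClient.on('connect', (connection) => {
  console.log('Connected to SightPlusPlus-Server');

  connection.on('message', (message) => {
    if (message.type === 'utf8') {
      console.log('Server says:', message.utf8Data);
    }
  });

  // Hypothetical registration message for a static camera at fixed coordinates.
  connection.sendUTF(JSON.stringify({ mode: 'static', latitude: 51.5246, longitude: -0.1340 }));
});

wsClient.connect(SERVER_URL);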
SightPlusPlus-App
To use the app you can download the APK from here:
https://liveuclac-my.sharepoint.com/:u:/g/personal/zcabvud_ucl_ac_uk/EQxxIyK-tgZPnFGgm_Zo7zIBs09yk71X4JQooh_0qJq0sg?e=9JUZe1
Download it on your Android phone and install it. Now you can use the app. Make sure you download and start the Server and the Client systems first.
You can also download the code and run it on an emulator, or deploy it directly to your own device. Now press the screen and
tell the system to START or STOP, change the SPEED (e.g. SPEED 0.6), VOLUME (e.g. VOLUME 0.6), or PITCH (e.g. PITCH 0.6), turn the
VIBRATION on or off (e.g. VIBRATION on), or start the system for the static cameras (by saying “Location”). Double-tap the screen to
signal that there was an error with the recognized object.
To work on the code, you need to install the Flutter SDK. You can do it from here:
https://flutter.dev/docs/get-started/install.
If you need to create a new APK, just type in the terminal (make sure you are in the root folder) flutter build apk.
NOTE: The client and the app use Firebase. The credentials should be changed to your own.
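On the Node.js client side, Firebase access goes through the firebase-admin package listed in the licences below. The TypeScript sketch below shows where such credentials would typically be supplied; the service-account file name and database URL are placeholders for your own Firebase project, not values taken from this repository.

// Minimal sketch of initializing firebase-admin with your own credentials.
// The service-account path and database URL are placeholders.
import * as admin from 'firebase-admin';

const serviceAccount = require('./serviceAccountKey.json'); // placeholder: key downloaded from your Firebase project

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  databaseURL: 'https://<your-project-id>.firebaseio.com', // placeholder URL
});

// Example: write a test value to the Realtime Database to confirm the credentials work.
admin.database().ref('healthcheck').set({ ok: true, timestamp: Date.now() })
  .then(() => console.log('Firebase credentials are working'))
  .catch((err) => console.error('Firebase initialization failed:', err));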
Thank you for your attention, Team 5
The MIT Licence for Sight++ APP
Copyright (c) 2021 Sight++ Team
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"),
to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense,
and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
1. NodeJS public-ip; https://www.npmjs.com/package/public-ip; licence type: MIT
2. NodeJS websocket; https://www.npmjs.com/package/websocket; licence type: Apache License 2.0
3. NodeJS firebase-admin; https://www.npmjs.com/package/firebase-admin; licence type: Apache License 2.0
4. firebase; https://pub.dev/packages/firebase; licence type: MIT
5. firebase-core; https://pub.dev/packages/firebase_core; licence type: BSD
6. geolocator; https://pub.dev/packages/geolocator; licence type: MIT
7. flutter_speech; https://pub.dev/packages/flutter_speech; licence type: BSD 2-Clause "Simplified" License
8. firebase-database; https://pub.dev/packages/firebase_database; licence type: BSD
9. vibration; https://pub.dev/packages/vibration; licence type: BSD
10. flutter_tts_improved; https://pub.dev/packages/flutter_tts_improved; licence type: MIT
11. flutter_tts; https://pub.dev/packages/flutter_tts; licence type: MIT
12. geocoder; https://pub.dev/packages/geocoder; licence type: MIT
13. http; https://pub.dev/packages/http; licence type: BSD
14. sqflite; https://pub.dev/packages/sqflite; licence type: MIT
15. intl; https://pub.dev/packages/intl; licence type: BSD
The Sight++ Team undertakes the entire work of Intellectual Property matters for the industrial design, the source code of the software, the associated documentation files (the "Software"), and copyright; copying, enforcement, maintenance, rendering advice, and handling litigation are permitted for all Intellectual Property Rights.
1. Basic information about our team
We are the developers of this Sight++ APP, which is designed for visually impaired people. In order to better optimize this APP in the future, we hope to collect some data from users legally and fairly.
2. Principles of processing personal data
3. Details about the collected data (What data and how data will be used)
Sight++ data retrieved from the mobile Intel RealSense camera, including the time, the location, the remote/static attribute, and a true/false error parameter, will be collected by our developers. The data will be used to fix bugs and to develop better recognition technology, so that this APP can provide a better experience for visually impaired users in the future.
The software is an early proof of concept for development purposes and should not be used as-is in a live environment without further redevelopment and/or testing. No warranty is given and no real data or personally identifiable data should be stored. Usage and its liabilities are your own.
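For illustration, a collected record with the fields described above might have a shape like the TypeScript sketch below; the field names and types are hypothetical, not the project's actual schema.

// Hypothetical shape of a collected usage record, based on the fields listed
// above (time, location, remote/static attribute, error flag).
interface UsageRecord {
  time: string;              // e.g. ISO 8601 timestamp of the event
  latitude: number;          // location where the record was produced
  longitude: number;
  mode: 'remote' | 'static'; // which client mode produced the record
  error: boolean;            // true if the user reported a recognition error
}

const example: UsageRecord = {
  time: new Date().toISOString(),
  latitude: 51.5246,
  longitude: -0.1340,
  mode: 'remote',
  error: false,
};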
This is the link to our Development Blog: https://kingbladelee.com