IELS: Indoor Equipment Localization System: Conclusion (Part 6/6 final) (IoT)

Conclusion

We conclude that there is a gap between the research papers reviewed here and the commercial products. Most research projects skip the practical usage of iBeacons and focus primarily on specific sub-problems, while none of the commercial solutions we found is free of cost. We therefore see room for open source standard software for an indoor localization system. We aim to focus on the practical usage of iBeacons for indoor localization and, in the short term, to share the knowledge from this thesis with potential contributors. We would like to continue working on this project, improve it, and eventually release it as open source standard software for indoor localization, making it possible to use the system for crowdsensing: people carrying smartphones with the proper application would periodically collect both the smartphone's location and the locations of trackable objects, and this data would be sent to a centralized backend infrastructure according to end-user behavior and requirements.

The work by Bulten [1] of Radboud University in the Netherlands has been a major inspiration for our thesis. Bulten focuses on practical usage of iBeacons and has already released source code written in JavaScript.

We think the combination of the iBeacons' low price, long battery life, and ease of deployment makes them an excellent choice for indoor localization.

Another aspect for future research would be improving the algorithm's efficiency, along with the overall system improvements mentioned in the discussion.

Finally, it should be concluded that the environment surrounding the iBeacons, as well as the smartphone's hardware type and model, has a major impact on the final quality of the results, regardless of the technique used.

Extra stuff

IT University of Copenhagen
Experiment on the 5th floor at ITU
IELS long-term concept
360° panorama from PitLab
IELS infographic

Bibliography

[1] Wouter Bulten. Human SLAM: Simultaneous localisation and configuration (SLAC) of indoor wireless sensor networks and their users, 2015.

[2] Song Chai, Renbo An, and Zhengzhong Du. An indoor positioning algorithm using Bluetooth Low Energy RSSI.

[3] Yu-Chung Cheng, Yatin Chawathe, Anthony LaMarca, and John Krumm. Accuracy characterization for metropolitan-scale Wi-Fi localization.

[4] Ramsey Faragher and Robert Harle. An analysis of the accuracy of Bluetooth Low Energy for indoor positioning applications. In Proceedings of the 27th International Technical Meeting of the Satellite Division of the Institute of Navigation (ION GNSS+ 2014), Tampa, Florida, September 2014, pp. 201-210, 2014.

[5] Atul Gosai and Rushi Raval. Real time location based tracking using WiFi signals.

[6] Xiaofan Jiang, Chieh-Jan Mike Liang, Kaifei Chen, Ben Zhang, Jeff Hsu, Jie Liu, Bin Cao, and Feng Zhao. Design and evaluation of a wireless magnetic-based proximity detection platform for indoor applications.

[7] Jonathan Fürst, Kaifei Chen, and Philippe Bonnet. Evaluating and improving Bluetooth Low Energy performance in the wild.

[8] John Krumm. Ubiquitous Computing Fundamentals. CRC Press, 2010.

[9] Anthony LaMarca and Eyal de Lara. Location systems: An introduction to the technology behind location awareness. Synthesis Lectures on Mobile and Pervasive Computing, 3(1):1-122, 2008.

[10] Jea-Gu Lee, Byung-Kwan Kim, Sung-Bong Jang, Seung-Ho Yeon, and Young Woong Ko. Accuracy enhancement of RSSI-based distance estimation by applying Gaussian filter. Indian Journal of Science and Technology, 9(20), 2016.

[11] Zhouchi Li, Yang Yang, and Kaveh Pahlavan. Using iBeacon for newborns localization in hospitals. In 2016 10th International Symposium on Medical Information and Communication Technology (ISMICT). IEEE, 2016.

[12] Filip Mazan and Alena Kovarova. A study of devising neural network based indoor localization using beacons: First results. In CISJ, vol. 19, no. 1, 2015.

[13] P. Bahl and V. N. Padmanabhan. RADAR: An in-building RF-based user location and tracking system. In INFOCOM 2000. Nineteenth Annual Joint Conference of the IEEE Computer and Communications Societies. Proceedings. IEEE, vol. 2, pp. 775-784, 2000.

[14] Jeongyeup Paek, JeongGil Ko, and Hyungsik Shin. A measurement study of BLE iBeacon and geometric adjustment scheme for indoor location-based mobile applications. Mobile Information Systems, 2016:1-13, 2016.

[15] Piotr Sapiezynski, Arkadiusz Stopczynski, Radu Gatej, and Sune Lehmann. Tracking human mobility using WiFi signals. PLoS ONE, 10(7):e0130824, 2015.

[16] Shinsuke Kajioka, Tomoya Mori, Takahiro Uchiya, Ichi Takumi, and Hiroshi Matsuo. Experiment of indoor position presumption based on RSSI of Bluetooth LE beacon. In IEEE 3rd Global Conference on Consumer Electronics (GCCE), 7-10 Oct. 2014, 2014.

[17] Z. Chen, Q. Zhu, H. Jiang, and Y. C. Soh. Indoor localization using smartphone sensors and iBeacons. In IEEE 10th Conference on Industrial Electronics and Applications (ICIEA), Auckland, 2015, pp. 1723-1728, 2015.

[18] Faheem Zafari and Ioannis Papapanagiotou. Enhancing iBeacon based micro-location with particle filtering.

[19] Zhu Jianyong, Chen Zili, Luo Haiyong, and Li Zhaohui. RSSI based Bluetooth Low Energy indoor positioning. In IEEE 2014 International Conference on Indoor Positioning and Indoor Navigation (IPIN), 2014.

IELS: Indoor Equipment Localization System: Discussion (Part 5/6) (IoT)

Discussion

When this project started, the idea was simple. The goal was to localize objects inside a building environment in an easy and affordable way.

This idea turned out to span a wide research area and led us to try different approaches, as explained in this chapter.

We think that adding step counting, compass, and gyroscope sensors to predict the direction of movement would help create extra virtual positions, giving our calculation process additional input. We ran a few experiments in this direction but decided to leave it for future development.

In addition, concepts like fingerprinting and dead reckoning would add value to the system; both should be considered for future improvement.

At some point we also considered machine learning for clustering and prediction. We think this can be an interesting research area in its own right.

In our experiments the user held the smartphone in hand, but what about collecting data with the smartphone in a pocket, at leg level, or on a table? This would be interesting to clarify, as our ultimate goal is to predict object positions while smartphones stay in users' pockets.

We also thought about adding extra smartphones at fixed positions that collect data continuously, to improve the stability of the results.

Another interesting aspect would be to try the same approach with fewer iBeacons, or with directional iBeacons covering specific regions. We tried working with eight iBeacons, but observed that our software then failed to return results. Yet another approach would be to use only four iBeacons, one in each corner of the room, and estimate a region rather than an x and y position. In the end we chose to focus on our existing setup with 14 iBeacons and leave this for further research.

Power consumption is important to bring up, as it means a lot to end-users: we do not want our app to drain the battery of a user's smartphone. It is therefore ideal to collect data periodically. For instance, we could collect data only when the user or the smartphone is stationary, and collect again when the user or smartphone changes position. This would reduce noise, improve accuracy, and extend battery operation time.
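As a sketch of this idea, the collector could gate sampling on a simple stillness check over recent accelerometer magnitudes. The window size and variance threshold below are illustrative assumptions, not values from our experiments:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch: only trigger a 60-second iBeacon collection
// when the phone has been still for a while, to save battery.
public class StillnessGate
{
    private readonly Queue<double> _window = new Queue<double>();
    private const int WindowSize = 50;              // ~1 s of samples at 50 Hz (assumed)
    private const double VarianceThreshold = 0.05;  // illustrative threshold

    // Feed accelerometer magnitudes; returns true once the device looks stationary.
    public bool IsStill(double ax, double ay, double az)
    {
        double magnitude = Math.Sqrt(ax * ax + ay * ay + az * az);
        _window.Enqueue(magnitude);
        if (_window.Count > WindowSize) _window.Dequeue();
        if (_window.Count < WindowSize) return false;

        double mean = _window.Average();
        double variance = _window.Average(m => (m - mean) * (m - mean));
        return variance < VarianceThreshold;
    }
}
```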

In the long term, if our app should cover large buildings, we could combine iBeacon data and Wi-Fi data to cover most of a building without necessarily deploying iBeacons everywhere.

It is true that our algorithm is tested, works, and delivers the results we present in this report, but we would like to mention that there is room for further improvement. While developing the algorithm we did not consider efficiency or running time, so these can likely be improved. Moreover, sorting iBeacons by strongest signal does not guarantee that the strongest signal comes from the nearest iBeacon, which has reduced the overall accuracy of our results. This is left for future improvement.
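One cheap mitigation, for example, would be to rank beacons by a median-filtered RSSI rather than a single raw reading, so one multipath spike does not decide which beacon looks "nearest". This is an illustration of that idea, not our implemented algorithm:

```csharp
using System.Collections.Generic;
using System.Linq;

public record BeaconReading(string BeaconId, List<int> RssiSamples);

public static class BeaconRanking
{
    // Median-filter each beacon's RSSI samples before ranking; the median is
    // robust against occasional multipath spikes in the raw readings.
    public static List<string> RankByMedianRssi(IEnumerable<BeaconReading> readings)
    {
        return readings
            .Select(r => (r.BeaconId, Median: Median(r.RssiSamples)))
            .OrderByDescending(r => r.Median) // stronger (less negative) RSSI first
            .Select(r => r.BeaconId)
            .ToList();
    }

    private static double Median(List<int> samples)
    {
        var sorted = samples.OrderBy(s => s).ToList();
        int mid = sorted.Count / 2;
        return sorted.Count % 2 == 1 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2.0;
    }
}
```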

An interesting test would be a real evaluation with real end-users in a real environment outside the IT University. Especially the input from end-users and business owners would be valuable information to have.

Our ultimate goal is to use the smartphone as a crowdsensing device, but this raises new questions about privacy. For example: will end-users accept their smartphones collecting their position so that we can use it for localizing trackable objects? Is there a mechanism for using crowdsensing to predict object locations without retaining the data from a smartphone? These questions need to be studied and clarified.

IELS: Indoor Equipment Localization System: Evaluation (Part 4/6) (IoT)

Evaluation

Evaluation is important and a key step in completing a project. This evaluation presents the results of our system for locating the trackable object's position and tracking its repositioning (movement) over time, using smartphones as data collectors. The usability of the system relies on the results collected from smartphones over time: more collection from more users means better results and improved accuracy. Accuracy in this context means predicting the position of the smartphone in the testbed 4.1 (PitLAB) as well as possible at each direction, as demonstrated in picture 4.8. Since this is an experimental use case, the testbed (PitLAB) environment includes everything from people sitting around to obstacles and furniture, as shown in the panoramic image 4.2.

Note: The pictures may have changed during the project period, since PitLAB was under construction during this study.

Figure 4.1: Testbed at PitLAB; the yellow circles indicate three trackable objects. This picture demonstrates how things look in PitLAB and some of the trackable objects' true positions. In our experiment we use only one trackable object.

4.1    Experimental Setup

The system setup consists of 14 iBeacons mounted at fixed X and Y positions (all distances are shown in figure 4.4), distributed at relatively equal intervals along each axis. The height of the iBeacons above the floor is around 2100-2300 mm; the variation is due to the room structure and furniture (figure 4.2).

The idea is that the user has a smartphone running our Mobile Client application. The smartphone collects data from the surrounding iBeacons, such as RSSI values and UUID/MAC addresses. The system then uses the RSSI data to predict the position of the smartphone in the room.
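A common way to turn RSSI into an approximate distance is the log-distance path-loss model. The sketch below assumes a calibrated TxPower (the expected RSSI at 1 m) and an environment-dependent path-loss exponent n, both of which have to be measured; it is not necessarily the exact formula used by our Data Server:

```csharp
using System;

public static class RssiDistance
{
    // Log-distance path-loss model: distance = 10^((txPower - rssi) / (10 * n)).
    // txPower is the expected RSSI at 1 m; n is the path-loss exponent
    // (roughly 2 in free space, higher indoors). Both values are assumptions.
    public static double EstimateDistanceMeters(double rssi, double txPower = -59.0, double n = 2.5)
    {
        return Math.Pow(10.0, (txPower - rssi) / (10.0 * n));
    }
}
```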

Figure 4.2: PitLAB panoramic view; a scaled version of this image is in the appendix chapter. Zooming into the image shows the nature of PitLAB: people, furniture at different heights, work spaces, desks, etc.

4.1.1    Trackable object

We use the same iBeacon type for our trackable object; the only difference is that the trackable object does not have a fixed position. Its iBeacon is mounted at a height of 1200 mm on a retired flag pole (figure 4.3).

Figure 4.3: Trackable object (pole) tagged with an iBeacon. We asked the Facility Management department of the IT University whether they could provide a pole for testing, and luckily they had some old flag poles, which were used throughout the study.

Figure 4.4: Fixed iBeacon positions in the room. This top view of the PitLAB room shows how the 14 iBeacons are placed at fixed positions.

4.1.2    iBeacon specification

All our iBeacons are of the Estimote brand (figure 4.5). The two most important iBeacon configuration parameters are transmitting power (TxPower) and advertising interval. TxPower is set to -12 dBm and the advertising interval to 967 ms on all iBeacons.

Figure 4.5: Estimote iBeacons

4.1.3    Smartphone

The smartphone is a Google LG Nexus 5X. According to recent research by Fürst [7] at the IT University, this smartphone should be the best phone for receiving iBeacon signals, which makes it a suitable choice for our project.

Figure 4.6: Google Nexus 5X

4.2    Experiment

We mark the floor with four positions, called true positions: A, B, C, and D. At each true position we measure in four directions: 0, 90, 180, and 270 degrees, with 0 pointing approximately north, as demonstrated in picture 4.8.

We collect data from all iBeacons (the fixed-position iBeacons and the trackable object's iBeacon) in each direction over 60 seconds, holding the smartphone in hand, centered in front of the body, as demonstrated in picture 4.8. We repeat this in four rounds.

For instance, if the trackable object is placed at C, we collect data from true position A in Round 1 and true position B in Round 2, in all directions. We then move the trackable object to D and collect from true position B in Round 1 and true position C in Round 2, and so on, until the trackable object has been through all true positions, as shown in the experiment table 4.7.

Figure 4.7: This table presents the rounds and the time sequence. Round 1 and Round 2 are parallel rounds: if we predict the smartphone at A and predict the smartphone at B, we predict the trackable object at C. Later in this section, eight different graphs (Graph1 to Graph8) will present the results of the experiments.

Figure 4.8: A person holding the smartphone, testing RSSI signal reception in Round 3 at true position B, facing the 180° direction.

Figure 4.9: Testbed at PitLAB. Shown here is Round 3, for instance, where the user tests the smartphone at true position A and moves to true position B to predict the trackable object at true position C.

4.3    Results and analysis

From the experiments, Rounds 1 and 2 give a total of 19 trackable object estimation results, and Rounds 3 and 4 give 23. The red marks belong to true position C, green to true position D, blue to true position A, and orange to true position B. For each colour cluster, we calculate a centroid of the collected data and represent it with a bigger circle. We have added a horizontal and a vertical line through the centre of the graph, so that each true position gets its own region: AR, BR, CR, and DR respectively. The cross (+ sign) in the graph represents the true position of the trackable object, as demonstrated in graphs 4.10 and 4.11.

As shown for Rounds 1 and 2, the red centroid falls in region CR, green in DR, and blue in AR, except for the last one, orange, which falls in AR. For Rounds 3 and 4, however, all colours fall in their respective regions (we calculate each centroid by taking the average of x and y for each cluster).
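The centroid computation itself is straightforward; a minimal sketch of the average-of-x-and-y calculation we describe above:

```csharp
using System.Collections.Generic;
using System.Linq;

public static class Clustering
{
    // Centroid of a cluster of position estimates:
    // the mean of the x values and the mean of the y values.
    public static (double X, double Y) Centroid(IReadOnlyList<(double X, double Y)> points)
    {
        return (points.Average(p => p.X), points.Average(p => p.Y));
    }
}
```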

Graph 1 (4.10) shows the results from Rounds 1 and 2, and Graph 2 (4.11) shows the results of Rounds 3 and 4.

Figure 4.10: Rounds 1 and 2. The (+) signs represent true positions of the trackable object, the small dots represent trackable object estimation results over time, and the bigger circle is the centroid of the small-dot cluster.

Figure 4.11: Rounds 3 and 4. The (+) signs represent true positions of the trackable object, the small dots represent trackable object estimation results over time, and the bigger circle is the centroid of the small-dot cluster.

As explained at the beginning of this section, we predict a smartphone position at different angles at different true positions. For instance, we predict a smartphone at true position A and likewise at true position B, then use the results to calculate and predict the position of the trackable object at C, and so on.

The results from the experiments are presented in the following eight graphs. Each graph has two centroids of two predicted smartphone positions over four different directions, marked with a cross (X) sign; smartphone true positions are marked with a plus (+) sign; the small dots are generated by our trilateration method from combinations of the two predicted smartphone positions at different directions; and a big circle represents the centroid of all small dots for the trackable object.
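Geometrically, combining distance estimates from two predicted smartphone positions is a circle-intersection problem: each smartphone position plus its estimated distance to the trackable object defines one circle. The sketch below shows the standard two-circle intersection math, assuming the circles actually overlap; it is not necessarily our exact implementation, and noisy radii in practice often need clamping or a fallback:

```csharp
using System;

public static class Trilateration
{
    // Intersect two circles with centers (x0, y0), (x1, y1) and radii r0, r1.
    // Returns the two candidate points, or null when the circles do not intersect.
    public static ((double X, double Y) P1, (double X, double Y) P2)? Intersect(
        double x0, double y0, double r0,
        double x1, double y1, double r1)
    {
        double dx = x1 - x0, dy = y1 - y0;
        double d = Math.Sqrt(dx * dx + dy * dy);

        // No solution: circles too far apart, one contained in the other, or coincident.
        if (d > r0 + r1 || d < Math.Abs(r0 - r1) || d == 0) return null;

        // Distance from the first center to the chord's midpoint, and half-chord length.
        double a = (r0 * r0 - r1 * r1 + d * d) / (2 * d);
        double h = Math.Sqrt(Math.Max(0, r0 * r0 - a * a));
        double mx = x0 + a * dx / d, my = y0 + a * dy / d;

        return ((mx + h * dy / d, my - h * dx / d),
                (mx - h * dy / d, my + h * dx / d));
    }
}
```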

Before we look into the eight graphs, we present an example graph 4.12 and explain the signs:

+ sign: True position
X sign: Centroid of smartphone predicted position
O sign: Centroid of trackable object predicted position
o sign: Predicted positions of trackable object
A: position top left
B: position bottom left
C: position bottom right
D: position top right

Figure 4.12: Graph: This is a sample graph with explanation of different signs we use in the upcoming graphs.

Figure 4.13: Graph1: Collecting data from the two true positions A(+) and B(+). A(X) and B(X) are the smartphone predicted positions; B(X) is far from B(+). The result is the predicted trackable object centroid C(O) against the true position C(+).

Figure 4.14: Graph2: Collecting data from the two true positions B(+) and C(+). B(X) and C(X) are the smartphone predicted positions. The result is the predicted trackable object centroid D(O) against the true position D(+).

Figure 4.15: Graph3: Collecting data from the two true positions C(+) and D(+). C(X) and D(X) are the smartphone predicted positions; C(X) is far from C(+). The result is the predicted trackable object centroid A(O) against the true position A(+).

Figure 4.16: Graph4: Collecting data from the two true positions D(+) and A(+). D(X) and A(X) are the smartphone predicted positions. The result is the predicted trackable object centroid A(O) against the true position B(+); our algorithm was only able to predict one position.

Figure 4.17: Graph5: Collecting data from the two true positions A(+) and B(+). A(X) and B(X) are the smartphone predicted positions; B(X) is far from B(+). The result is the predicted trackable object centroid C(O) against the true position C(+).

Figure 4.18: Graph6: Collecting data from the two true positions B(+) and C(+). B(X) and C(X) are the smartphone predicted positions; C(X) is far from C(+). The result is the predicted trackable object centroid D(O) against the true position D(+).

Figure 4.19: Graph7: Collecting data from the two true positions C(+) and D(+). C(X) and D(X) are the smartphone predicted positions. The result is the predicted trackable object centroid A(O) against the true position A(+).

Figure 4.20: Graph8: Collecting data from the two true positions D(+) and A(+). D(X) and A(X) are the smartphone predicted positions. The result is the predicted trackable object centroid B(O) against the true position B(+).

Since we have an average error level of 2.04 m with a standard deviation of 0.28 m (section 4.4), if we want to visualize the trackable object's position on a map it is better to present it in a more human-friendly, readable way.

In our map we have four true positions. If we split the room horizontally and vertically in half, this gives us four regions, where each true position belongs to its respective region as described earlier.

We use the same convention as before: AR, BR, CR, and DR, where AR stands for A Region, BR for B Region, etc. Taking the trackable object position results from the previous graphs, we can present the trackable object's position in its region and its movement over time. As the first graph shows, the trackable object has moved over time from C to D, D to A, and finally A to B.
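Mapping a predicted (x, y) to a region is then a simple comparison against the room's midlines. A sketch, assuming the room's width and depth are known and the origin is at one corner:

```csharp
public static class Regions
{
    // Classify a position into AR/BR/CR/DR by splitting the room in half on
    // each axis. A is top left, B bottom left, C bottom right, D top right,
    // matching the legend used in the graphs above.
    public static string Classify(double x, double y, double roomWidth, double roomDepth)
    {
        bool left = x < roomWidth / 2.0;
        bool top = y >= roomDepth / 2.0;
        if (left) return top ? "AR" : "BR";
        return top ? "DR" : "CR";
    }
}
```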

In the first graph we see the trackable object move over time along the expected path, except from A to B. Looking back at Graph4 (4.16), we see only one dot where there should normally be more than one, as shown in graph 4.21. Our system failed to calculate the results at that particular place. The reasons can vary, but one possibility is heavy noise at that particular time.

In the next graph, the trackable object follows the path as expected, as shown in graph 4.22.

Depending on the room size and the error level, we assume the room can be split into more regions to get smoother results. That said, we conclude that continuous data collection from more phones over time will improve the accuracy of the position predictions.

Figure 4.21: This graph illustrates the trackable object's location in regions and its movement from place to place over time, as a result of combining Rounds 1 and 2. As we can see, the trackable object is supposed to move to region B, but since our system fails to calculate the result, the trackable object stops at region A.

Figure 4.22: This graph illustrates the trackable object's location in regions and its movement from place to place over time, as a result of combining Rounds 3 and 4.

4.4    Error level

In chapter 2, under the RSSI measurement section 2.2.2, we discussed iBeacon and smartphone challenges, and in the same chapter the RSSI distance calculation 2.8, which shows our iBeacon's measurement errors over distance. Taking all of that into consideration and looking at our results, we find an error margin in our results as well.

We have made two graphs to illustrate the error level of our system. The first graph demonstrates the error level of a smartphone. We present the results of each round and each true position point in a table, then calculate the hypotenuse of the X and Y errors to get the error distance, and present it in a cumulative distribution function (CDF) graph 4.24. Our overall results show errors at different levels, but interestingly, almost 75% of the results have an error level below 1.5 m, with an average error of 1.33 m and a standard deviation of 0.13 m.

However, in the cumulative distribution function (CDF) graph 4.26 for the trackable object, the average error rises to 2.04 m with a standard deviation of 0.28 m. The interesting aspect of this graph is that 75% of the results are below 2 m.

If we compare the trackable object's error to the smartphone's, the trackable object's error level is higher. This is expected by the nature of the results, since the trackable object gets its position from smartphone position estimates that already carry an error margin.

We calculate the hypotenuse with the following formula: d = √(x² + y²), where d is the error distance from the true position and x and y are the per-axis errors.
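A sketch of how the per-point error distance and a point on the empirical CDF can be computed from the per-axis errors:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class ErrorStats
{
    // Euclidean error distance from the true position: d = sqrt(x^2 + y^2).
    public static double ErrorDistance(double xError, double yError)
        => Math.Sqrt(xError * xError + yError * yError);

    // Fraction of results with an error at or below the given level,
    // i.e. one point on the empirical CDF (e.g. CdfAt(errors, 1.5) ≈ 0.75).
    public static double CdfAt(IEnumerable<double> errors, double level)
    {
        var list = errors.ToList();
        return list.Count(e => e <= level) / (double)list.Count;
    }
}
```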

Figure 4.23: Smartphone distance error on each axis (x and y) for each true position and each round, with the average value of each round.

Figure 4.24: CDF graph of the smartphone distance error, the hypotenuse of the x and y values from the previous table, for each round.

Figure 4.25: Trackable object distance error on each axis (x and y) for each true position and each combined round, with the average value of each round.

Figure 4.26: CDF graph of the trackable object error, the hypotenuse of the x and y values from the previous table, for each combined round.

4.5    Conclusion of this chapter

We conclude that the BLE signals of iBeacons are hard to control. Even Estimote (the producer of the iBeacons we use in our experiment) mentions the precision issue on their website 1: it is around 20-30%. That said, we cannot control the nature of BLE behaviour. In addition, we have learned that we need to predict a high number of smartphone positions before we can predict our trackable object. However, by developing and improving algorithms, we are able to get useful results.

IELS: Indoor Equipment Localization System: Implementation (Part 3/6) (IoT)

Implementation

3.1    Software architecture

The software architecture is based on three main components: Mobile Client, Data Server, and Admin Server, as shown in figure 3.1.

  1. Mobile Client is an Android application. Its main purpose is to collect iBeacon data, such as RSSI values, and send it to the Data Server.
  2. Data Server is a RESTful API (Application Programming Interface) that takes care of the data collected by the Mobile Client application. The Data Server is also responsible for the business logic and the calculation algorithm, and has a UI to show the RSSI results of experiments.
  3. Admin Server is a web application with a RESTful API offering an administrator UI that helps manage the iBeacon and smartphone information.

The idea of separating the Data Server and the Admin Server is forward thinking: the Data Server's purpose was only experimentation and data analysis. We expect that at some point the Data Server will end its job and the algorithm logic will be implemented directly on the smartphones. That way, the smartphone will take care of all the logic and return its x and y position directly to the Admin Server.

Figure 3.1: System architecture. The Mobile Client connects once, at application startup, to the Admin Server to get a list of all allowed devices and iBeacons. When the Mobile Client is done collecting iBeacon data, it posts the data to the Data Server. When the Data Server starts, it connects to the Admin Server to get a list of all allowed devices and iBeacons.
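As an illustration of the Mobile Client to Data Server handoff, here is a hypothetical POST of collected readings. The endpoint URL and field names below are invented for the sketch; they are not the actual API of our system:

```csharp
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;

// Hypothetical payload: one 60-second collection run from one smartphone.
public record Reading(string BeaconId, int Rssi, long TimestampMs);
public record CollectionRun(string DeviceId, string ExperimentName, Reading[] Readings);

public static class DataServerClient
{
    private static readonly HttpClient Http = new HttpClient();

    // Post a finished collection run to the Data Server's REST API
    // (endpoint path assumed for illustration).
    public static Task<HttpResponseMessage> PostRunAsync(CollectionRun run) =>
        Http.PostAsJsonAsync("http://dataserver.local/api/measurements", run);
}
```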

3.1.1    Mobile Client

The main purpose of the Mobile Client Android application 3.2 is to collect iBeacon RSSI data. When the application starts up, Wi-Fi, Bluetooth, and the Location service must be turned on; otherwise the user is alerted 3.3. These are required for fetching the initial iBeacon data (fixed iBeacon positions, device names, etc.) from the Admin Server, sending and receiving data over Wi-Fi, and receiving iBeacon data over Bluetooth. When a user enters an experiment name and starts the collecting process, the application goes through six steps before and while collecting iBeacon data, as described here, shown in the process flow 3.5, and sketched in code after the list:

  1. Data-structure: the application connects to the Admin Server to build a data structure of activated iBeacons with initial information, such as the fixed X and Y positions of the iBeacons, iBeacon names, the name of the trackable object's iBeacon, and the smartphone's name if registered in the system.
  2. Registered: it checks that the smartphone device is registered and allowed by the Admin Server.
  3. Bluetooth: it checks that Bluetooth is receiving data from the iBeacons as expected.
  4. Connected: it checks for RESTful API connectivity.
  5. Collecting: the smartphone collects real-time data over 60 seconds.
  6. Send Data: when the 60 seconds are over, it posts the collected data to the Data Server.
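A sketch of this six-step flow as code; the check methods are placeholders standing in for the real Admin Server, Bluetooth, and REST calls:

```csharp
using System;
using System.Threading.Tasks;

// Sketch of the six-step collection flow described above; each line
// mirrors one numbered condition from the list.
public class CollectionFlow
{
    public async Task RunExperimentAsync(string experimentName)
    {
        await BuildDataStructureFromAdminServerAsync();                                           // 1. Data-structure
        if (!IsDeviceRegistered()) throw new InvalidOperationException("Device not registered");  // 2. Registered
        if (!IsBluetoothReceiving()) throw new InvalidOperationException("No iBeacon data");      // 3. Bluetooth
        if (!await IsApiReachableAsync()) throw new InvalidOperationException("API unreachable"); // 4. Connected

        var data = await CollectAsync(TimeSpan.FromSeconds(60)); // 5. Collecting
        await PostToDataServerAsync(experimentName, data);       // 6. Send Data
    }

    // Placeholder implementations; real versions would call the Admin Server,
    // the Bluetooth stack, and the Data Server's REST API.
    private Task BuildDataStructureFromAdminServerAsync() => Task.CompletedTask;
    private bool IsDeviceRegistered() => true;
    private bool IsBluetoothReceiving() => true;
    private Task<bool> IsApiReachableAsync() => Task.FromResult(true);
    private Task<object> CollectAsync(TimeSpan window) => Task.FromResult<object>(new { });
    private Task PostToDataServerAsync(string name, object data) => Task.CompletedTask;
}
```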

Figure 3.2: Preparing page.

Figure 3.3: Alert page.

Figure 3.4: Mobile Client Android application. When the application is ready to collect data for an experiment, the Start button appears automatically. When the collecting time is over, the application automatically posts the data and plays a beep sound. If one of the communication services (Wi-Fi, Bluetooth, or the Location service) is down, the user is alerted.

Figure 3.5: Mobile Client application process flow, showing how the application behaves from startup until the user has finished the experiment.

3.1.2    Data Server

The Data Server is a RESTful API. It receives iBeacon data from the Mobile Client application and stores it in RethinkDB 3.8, an open source real-time database. The Data Server is responsible for the business logic and the calculation algorithm. In addition, it offers a UI 3.6 showing graphs of RSSI results 3.7 for analysis purposes. The server runs on a local machine in PitLAB.

Figure 3.6: Data Server UI. We built this UI to deliver the tools we needed to make our experiment process smoother: it exported data to Excel and showed the RSSI values of a given experiment.

Figure 3.7: RSSI samples. An example of the Data Server UI showing the RSSI values of a given experiment.

Figure 3.8: Real-time RethinkDB administration UI. In our development we used RethinkDB; it has its own API, compatible with programming languages such as Java, C#, PHP, and many others. This gives us a simple query language, ReQL (RethinkDB Query Language), which we can use on different platforms.
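For illustration, inserting a measurement with ReQL from C# might look like the following, using the community RethinkDb.Driver package; the database and table names are assumptions, not our actual schema:

```csharp
using RethinkDb.Driver;
using RethinkDb.Driver.Net;

public static class RethinkExample
{
    private static readonly RethinkDB R = RethinkDB.R;

    public static void InsertMeasurement()
    {
        // Connect to a local RethinkDB instance (default port 28015).
        Connection conn = R.Connection().Hostname("localhost").Port(28015).Connect();

        // Insert one RSSI measurement; "iels"/"measurements" are assumed names.
        R.Db("iels").Table("measurements")
            .Insert(new { beaconId = "B7", rssi = -78, collectedAt = R.Now() })
            .Run(conn);

        conn.Close();
    }
}
```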

3.1.3    Admin Server

The Admin Server is a cloud-based web application that delivers an administrator UI 3.9 for managing all devices and iBeacons allowed to interact with the system. In addition, it serves all initial information about the iBeacons, the trackable object, and the Mobile Client.

Figure 3.9: Administrator UI. The Admin Server UI helped us add new iBeacons and smartphones to our experiment.

3.2    Development challenges

The previous chapter, in the section on challenges with different smartphones (2.2.2.1), described the challenges we had developing the Mobile Client.

We developed our Mobile Client on a Google Nexus 5X smartphone with a single iBeacon, and it collected iBeacon data as expected. When we tested the same application on the smartphone we got from PitLAB (a Motorola Moto E3), we got different results. At first we thought our software had issues, and we tried to rewrite and improve our code, but we ended up understanding that it was a hardware issue rather than a software one.

Throughout the project, we had minor network issues in PitLAB. Unfortunately, we were not able to start our server on the given IP address. Luckily, the management of PitLAB offered us access to an internal Wi-Fi network that was reliable and stable.

3.3    Conclusion of this chapter

This chapter has explained how our software infrastructure was implemented. At an early stage of the project's development process, the main idea was simple: how do we collect iBeacon data and process it? This idea soon required us to develop three main software solutions: a Mobile Client, a Data Server, and an Admin Server, all intercommunicating with each other. We have learned that software developed for iBeacons on one smartphone does not necessarily work on all smartphones.

How to Create virtual COM port pairs in Windows for development (IoT)

Just imagine you are developing an embedded or IoT device solution where your software needs to interact with a hardware device over a COM port.

Normally, you need your real device turned on and connected to your development PC before you can interact with it. The challenge is that you might be developing your software while the embedded prototype is not ready yet: you are waiting for it, you need to test locally before connecting to a real device, or you simply want to emulate the data communication before building the prototype. In that case, your only option is to create a virtual COM port pair.

In this article I am going to use Virtual Serial Port Driver 9.0 from Eltima Software.

I will demonstrate how to use Virtual Serial Port Driver 9.0 using two different examples:

  1. A chat example based on C#, using Microsoft's System.IO.Ports example.
  2. Using C# to emulate an embedded device that sends sensor data, and receiving that data on the other end.

Before we dig into the examples, let's download Virtual Serial Port Driver 9.0 Standard edition from https://www.eltima.com/products/vspdxp.

Install the software and start it.

Click on Add Pair.

And that's it: you now have two virtual COM ports, COM1 and COM2.

I have found Virtual Serial Port Driver 9.0 to be one of the easiest and most stable virtual serial port tools on the market. It allows me to develop and test my code before spending hours testing on a real device.

Example 1

I visited Microsoft's SerialPort class (System.IO.Ports) page and found a chat example. I have created a Visual Studio solution containing all the examples in this article on my git repo, ComPort. Clone it and fire up Visual Studio with the solution twice (I used VS2017); yes, I mean start Visual Studio two times, because we need two chat instances to connect to each other.

Make sure the Chat project is set as the start-up project in both VS instances.
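If you just want the gist of what the chat example does under the hood, the core System.IO.Ports usage looks roughly like this (COM1, the baud rate, and the other settings are whatever you press ENTER through in the console prompts):

```csharp
using System;
using System.IO.Ports;

class MiniChat
{
    static void Main()
    {
        // Open one end of the virtual pair; the other instance opens COM2.
        using var port = new SerialPort("COM1", 9600, Parity.None, 8, StopBits.One);
        port.Open();

        // Print whatever the other side sends as it arrives.
        port.DataReceived += (s, e) => Console.Write(port.ReadExisting());

        // Send whatever the user types, line by line.
        string? line;
        while ((line = Console.ReadLine()) != null)
            port.WriteLine(line);
    }
}
```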

[Screenshot: Visual Studio Chat]

Start the program (Ctrl + F5) in your first VS instance. The console will start asking you questions; just press ENTER (leave the answers empty) until you reach the name prompt, then give it any name. I call my first chatter Joe, as you can see in the image below.

[Screenshot: Virtual Serial Port Driver 9]

Start your second console program. This time you need to tell the program that you are using COM2, as shown in the image, then press ENTER until you reach the name prompt and give it any name. I call my second chatter Doe, as you can see in the image below.

[Screenshot: Virtual Serial Port Driver 9]

Now Joe and Doe can chat with each other.

[Screenshots: Virtual Serial Port Driver 9]

Just imagine this virtual communication happening with a real device that can receive messages from your system, or the other way around, as we will demonstrate in the next example.

Example 2

In this example, I am going to use one port to send some random sensor data, and I will fetch the data on the other port.

This example is based on a real scenario from one of my research and development projects. I was developing a hand gesture solution and my only prototype was not available, so I designed software that emulated the data and sent it as if it were the hand gesture hardware, over my virtual COM port.

Now, instead of having Chat as the start-up project, set the start-up project in your first Visual Studio instance to Emulator and in the second one to DataReceiver.
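The actual Emulator and DataReceiver projects are in the git repo; below is a condensed sketch of the same idea, assuming COM1 for the emulator, COM2 for the receiver, and a simple comma-separated x,y,z format (the two classes live in separate projects, each with its own Main):

```csharp
using System;
using System.IO.Ports;
using System.Threading;

// Project 1 (Emulator): writes random x,y,z "sensor" readings to COM1 twice a second.
class SensorEmulator
{
    static void Main()
    {
        var rng = new Random();
        using var port = new SerialPort("COM1", 9600);
        port.Open();
        while (true)
        {
            port.WriteLine($"{rng.Next(-90, 90)},{rng.Next(-90, 90)},{rng.Next(-90, 90)}");
            Thread.Sleep(500);
        }
    }
}

// Project 2 (DataReceiver): reads and parses the same stream from COM2.
class DataReceiver
{
    static void Main()
    {
        using var port = new SerialPort("COM2", 9600);
        port.Open();
        while (true)
        {
            string[] xyz = port.ReadLine().Split(',');
            Console.WriteLine($"x={xyz[0]} y={xyz[1]} z={xyz[2]}");
        }
    }
}
```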

[Screenshot: Visual Studio 2017]

Now start your Emulator program (Ctrl + F5); nothing visible should happen yet.

In your second Visual Studio instance, start the DataReceiver program. You should see that the Emulator program is sending x, y, z data and that it is being collected by the DataReceiver program.

[Screenshot: Virtual Serial Port Driver 9]

You can also see the amount of data being sent and received by the ports.

[Screenshot: Virtual Serial Port Driver 9]

Now you should be able to parse your received data and build your programming logic without having a real device.

Links: