TETRA Radios vs GSM and Wi-Fi – What is the Correct Choice?

Today, wireless communications technologies such as GSM and Wi-Fi are expanding and improving at unbelievable speed, and it is fair to raise doubts about the future of TETRA technology. This is especially true given the growing trend toward global unification of standards, which has pushed technologies such as WiMAX out of the competition entirely.

What is TETRA technology?

TETRA (TErrestrial Trunked RAdio) is a set of standards developed by the European Telecommunications Standards Institute (ETSI) that describes a common mobile radio communications infrastructure. It is the next-generation replacement for the old analog mobile and handheld radios used by public services as well as many industries such as construction or oil and gas.

Unlike its analog ancestors, TETRA was developed over many years of research as a digital standard that collectively provides the features of older technologies such as mobile radio, cellular telephony, paging, and wireless data.

TETRA networks are now deployed in well over 100 countries across Europe, the Middle East, and Asia.

Benefits of TETRA technology vs Analog Radios

The benefits of TETRA technology compared to the analog radios that preceded it are enormous:

  1. All communications are digital and encrypted.
  2. All modes of one-to-one, one-to-many, and many-to-many communications are available.
  3. Data transfer is possible on the same network.
  4. Calls can be seamlessly “relayed” between mobile stations, enabling communications over very broad geographic areas.

TETRA technology vs GSM

It is obvious that the growth of cellular networks such as GSM has limited the TETRA market in many ways. Where many private companies once depended heavily on analog radios for their communications, mobile phones have clearly taken over that market, limiting the use of TETRA radios mainly to public and emergency services.

The reasons that the public and emergency services still depend on TETRA technology are:

  1. Instant and easy one-to-many calls, which are critical in emergency situations
  2. The much lower frequency used (380 MHz vs 800/900 MHz) gives longer range, which in turn permits very high levels of geographic coverage
  3. In the absence of a network, mobile radios can use ‘direct mode’ to share channels directly (walkie-talkie mode)
  4. Encrypted communications
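The range advantage of the lower frequency in point 2 can be sanity-checked with the free-space path loss formula, FSPL(dB) = 20·log10(d) + 20·log10(f) + 32.44, with d in km and f in MHz. A minimal sketch of the comparison (the 10 km link distance is an illustrative assumption, not a figure from this article):

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# The same 10 km link at TETRA's 380 MHz vs GSM's 900 MHz:
loss_tetra = fspl_db(10, 380)  # ~104.0 dB
loss_gsm = fspl_db(10, 900)    # ~111.5 dB

# Lower frequency -> less path loss -> longer reach for the same power.
advantage_db = loss_gsm - loss_tetra  # equals 20*log10(900/380), ~7.5 dB
print(f"380 MHz saves about {advantage_db:.1f} dB over 900 MHz")
```

Roughly 7.5 dB less path loss at 380 MHz translates into substantially longer range per base station, which is why TETRA networks achieve wide geographic coverage with fewer sites.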

TETRA technology vs Wi-Fi

Most of the benefits mentioned above also apply to a Wi-Fi based network. However, due to the much higher frequency of Wi-Fi (2.4 or 5 GHz), its very short range of a few hundred feet makes it a non-option for public and emergency services.

Future of TETRA Technology

While I predict that TETRA radios will remain in wide use by public and emergency services around the world for years to come, thanks to the benefits listed above and the enormous investments already made in TETRA infrastructure, I see no future for the technology in the long run.

The future of communication infrastructure clearly lies in the expansion of a unified data infrastructure based on GSM technology and software-based solutions.

TETRA radios support a maximum data rate of only 3.5 kbps and require huge budgets both to set up and to maintain the infrastructure. It will not take long before we see similarly ruggedized, easy-to-use handheld radios that provide all the functionality of current TETRA radios and much more, but operate over dual Wi-Fi and GSM infrastructure at a fraction of the cost. Intelligent drones could also act as quick range extenders in emergencies or natural disasters.
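To put the 3.5 kbps figure quoted above in perspective, a quick back-of-the-envelope calculation (the payload sizes are illustrative assumptions) shows why such a rate rules out modern data applications:

```python
def transfer_time_s(size_bytes: int, rate_kbps: float) -> float:
    """Seconds needed to move size_bytes at rate_kbps kilobits per second."""
    return (size_bytes * 8) / (rate_kbps * 1000)

TETRA_RATE_KBPS = 3.5  # maximum data rate quoted above

# A short text status message is fine...
msg_s = transfer_time_s(140, TETRA_RATE_KBPS)            # ~0.32 s
# ...but a single 1 MB photo would take over half an hour.
photo_s = transfer_time_s(1_000_000, TETRA_RATE_KBPS)    # ~2286 s
print(f"140 B message: {msg_s:.2f} s, 1 MB photo: {photo_s / 60:.0f} min")
```

At that rate, even a single medium-resolution photo ties up the channel for roughly 38 minutes, which is why TETRA remains a voice-first technology.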

The question is not if this will happen, but how many years (or perhaps months) from now it will.

AI – Risks and Benefits

Only a few days ago, it was announced that Facebook, Amazon, Google, IBM, and Microsoft have come together to create the Partnership on AI. Almost at the same time, Microsoft announced it was expanding its AI efforts with the creation of a new Microsoft AI research group of 5,000 people worldwide, and Amazon offered a $2.5M Alexa Prize for an AI that can chitchat for 20 minutes!

Putting these side by side with other emerging technologies such as Big Data, Cloud Computing, IOT, and 3D printers would remind many of us of the “Terminator” or even “Matrix” movies and the possibility of what is called an “AI takeover” (machines defeating and taking over humans).

While there are many heated discussions these days about the moral and ethical aspects of AI as well as its risks and dangers, in this article I will try to share some ideas from an engineer’s point of view.

What is AI (Artificial Intelligence)?

In the shortest definition, Artificial Intelligence (AI) is intelligence exhibited by machines. Some of the main goals in AI are natural language processing, learning and self-evolution, reasoning, and planning.

AI has been a hot topic since the early days of computing, and the term itself dates back to the 1950s. Now, with increased processing power (especially through cloud computing), access to large amounts of data (Big Data), and advances in programming techniques, AI is back on the scene stronger than ever, with fascinating achievements daily.

At the end of the day, however, these are sophisticated software applications developed by humans to do tasks faster and with fewer errors.

Pictured below is Google’s self-driving car using artificial intelligence.

Pictured below are Sony’s Artificial Intelligence Robot (AIBO) robotic pets.

RoboCup is the leading and largest competition for intelligent robots, and one of the world’s most important technology events in research and training for artificial intelligence.

Risks and Dangers

The core reasons for the fears and doubts surrounding AI can be summarized as follows:

  • By giving machines self-learning, reasoning, and self-evolving capabilities, who can guarantee that they will not reach conclusions their designers never predicted, resulting in fatal actions?
  • With the integration of AI into technologies such as IOT and 3D printers, the impact of AI software is no longer limited to the software world – such AI solutions can make physical impacts and take control in the real world.
  • Who can guarantee what hackers or other bad actors might do with such complicated AI technology?

Benefits of AI

On the other hand, supporters of AI list all the unbelievable opportunities and benefits that the advancement of AI can provide:

  • Unbelievable acceleration of calculation and analysis of huge amounts of data, helping us to make faster and more accurate decisions.
  • Saving human lives by preventing human errors like the case of self-driving cars preventing accidents.
  • Doing the consistent, repetitive tasks in mass quantities without getting tired or being affected by personal emotions.
  • Enabling a much easier and human-like interaction with computers and access to information.

Conclusion

Like any other technology, AI can serve or harm humans based on how it is utilized. The more sophisticated the technology, the more it can impact our lives positively or negatively.

Hence there is a very good reason why the Partnership on AI is so important, so necessary, and so timely: if AI is not quickly regulated with the needed frameworks and safety procedures, we should all fear the outcomes of its rapid advancement. On the other hand, such rare partnerships, once embraced by a large population of interested parties, provide the means to ensure that AI becomes an instrumental and highly effective tool for improving human life, and nothing to be afraid of.

7 Reasons Why CCTV Security Systems are Vital for Your Business

Is it worth the price of keeping our facilities safe and secure? Do we get a good ROI when we spend our hard-earned cash on CCTV monitoring of our facilities? How crucial is it to have our premises covered by a well-designed, professionally installed security system?

One of the most cost-effective ways to provide security in the workplace is with CCTV security or video surveillance systems. The highly advanced technology of modern security cameras allows businesses to lower cost and risk by protecting their assets through continuous and seamless monitoring of their facilities. These relatively inexpensive cameras have for the most part replaced expensive security guards, while increasing reliability and accountability to near 100% by providing real-time remote video surveillance.

Here are seven reasons why CCTV systems are crucial and extremely necessary for businesses today:

  1. Reduce Cost and Risk

CCTV security systems prove themselves as the best investment as soon as they are installed. Full view of your premises and real-time recording as well as remote online access by owner/manager lowers the risk and prevents costly incidents such as burglary, fire, vandalism, etc.

  2. Prevent and Deter Crime

Criminals target buildings and facilities when they see there is no monitoring or watchdog. The very presence of CCTV cameras on a site is enough to deter potential criminals and stop their action at the outset. Similarly, at the sight of a safety hazard, employees can take the necessary action to reduce the risk and eliminate the source in the shortest possible time.

  3. Fool-Proof Coverage

A well-designed CCTV infrastructure is practically impenetrable and can provide 100% coverage. With multiple cameras covering each other’s blind spots as well as the most vulnerable areas of the property, monitoring is not hampered by human error, such as a guard dozing off or being busy with something else. Remote online access, along with recording of events, documents everything should it be needed by the police or a court.

  4. Keep Your Employees Honest

CCTV cameras on the outside prevent break-ins by outsiders, but when installed inside the facility to monitor sensitive materials, goods, and assets, as well as vulnerable areas, they also prevent wrongdoing by company employees. And if something does happen, the recorded video will prevent wrongful accusations and a loss of trust among staff.

  5. Encourage Good Behavior

CCTV cameras help create discipline among employees and customers alike, encouraging everyone to be on their best behavior. They also give customers a sense of security and safety, since they know they are protected, and confidence in doing business with you.

  6. Prevent Safety Incidents

CCTV cameras can be installed in high-risk areas of a business facility or establishment. These high-risk, accident-prone areas include locations where fires can break out, as well as locations where a potential danger to the building and personnel exists. Properly selected cameras can prevent potential damage, because with careful monitoring, emergency measures can be taken immediately.

  7. Assist Law Enforcement

CCTV recordings of a crime scene allow law enforcement agencies to use the footage and release photos and videos of the culprits to the public. A picture or video of a suspect can make a huge difference when it comes to making an arrest and getting dangerous criminals off the street.

So there you have it! We hope we have enlightened you on the importance of CCTV security systems for your business. If you wish to know more about CCTV systems, check out our article “How to Select the Correct CCTV Camera to Use”.

3D Printing and its Future – Brief Explanation

One of the technologies revolutionizing our world at an unbelievably rapid pace is 3D printing.

But what exactly is 3D printing? Very simply put, 3D printing (also known as “additive manufacturing”) turns digital 3D models into solid objects by building them up in layers.

For over two decades after its invention in the 1980s, 3D printing was a very expensive and fascinating technology used almost solely for rapid prototyping by large enterprises that could afford the very high costs. This is no longer the case, and it seems 3D printing is now going to play an important role in the ongoing technological revolution.

In this article, I’ve tried to explore this very briefly.

Categorizing 3D Printers

The variety of 3D printers, the technologies they use, and their applications are expanding almost daily. I see the pace as very much like what happened with computers: from the mainframe computers of the 1960s and ’70s to what we know today as “IOT” (Internet of Things), where you will find an embedded computer in almost every device and appliance you use in your daily life.

From an engineer’s point of view, I would categorize this variety of 3D printers as follows, to better structure the topic:

  1. Size: 3D printers can be categorized based on size – which is obviously proportional to the size of the product they are designed to produce. We now have microscale and nanoscale 3D printers used to create extremely precise objects with resolutions as low as 10 nm! On the other side of the scale, we have a giant 37-meter-long printer that can print a 2,700-square-foot office in 17 days!

Above is a 3D office interior of the world’s first 3D printed office building in Dubai.

  2. Material: 3D printers can also be categorized based on the type of material they build their objects from. Apart from all kinds of plastics, there are now printers that can create objects made of metal, concrete, paper, or even living human cells!
  3. Manufacturing Process: Not all 3D printers use the same technology for their manufacturing process. There are tens of different technologies in use, and new ones are being invented every day. The 7 main current categories of 3D manufacturing processes are:
    1. Material Jetting
    2. Binder Jetting
    3. Material Extrusion
    4. Powder Bed Fusion
    5. Vat Photopolymerisation
    6. Sheet Lamination
    7. Directed Energy Deposition

This is a Material Extrusion 3D Printer, and a table that it has printed out:

3DMonstr photopolymerization 3D printer and its photocentric 3D print:

Fractal design metal sculpture printed via powder bed fusion.

Safety helmets printed on Stratasys polyjet material jetting hardware:

Color output from a 3D Systems Projet 4500, and an ExOne sand cast pattern.

Desktop direct metal 3D printing using Realizer SLM 50

Fruit and bowl 3D printed in paper using a MCor sheet lamination 3D printer:

Uses for 3D Printing

Over the past few years, the range of applications for 3D printing has been expanding at an astonishing rate. 3D printers are now so affordable that I predict that, as early as 2020, they will be as common as ordinary printers, with one in the average home.

Some of the main usages of 3D Printers are:

  1. Rapid Prototypes: While this has been the main use of 3D printers since their invention, as the technology improves and prices fall, it will enable individual inventors and engineers to quickly turn their ideas into reality without depending on investors and lengthy processes.
  2. Printing Molds: 3D printers dramatically reduce the time and cost of creating molds and other tooling, which is a critical part of manufacturing and mass production. This also enables manufacturers to produce smaller quantities of a product while keeping costs reasonable.
  3. Direct Digital Manufacturing of Products (DDM): 3D printers enable direct manufacturing of products by the printer. This means millions of new business possibilities, such as manufacturing all kinds of customized products tailored to a client’s specific demands.
  4. Medical: 3D printers are creating a real revolution in the medical industry – doctors and surgeons can now print all kinds of human body parts to a patient’s exact specifications, whether from artificial materials, different proteins, or even living cells. 3D printers are also used to create tiny medical robots and highly dissolvable medicine pills.
  5. Construction: maybe the most recent example is Dubai’s “Office of the Future”, where customized one-story, 2,700-square-foot offices, along with interior furniture, were built in only 17 days at a fraction of the usual cost and material waste.
  6. Automotive and Aerospace: 3D printers enable building lighter, more complex parts for vehicles and planes – they are so important to the future of the automotive and aerospace industries that GE acquired two 3D-printing pioneers just a few days ago!
  7. Art: 3D printers open unlimited new possibilities for creative arts.
  8. Personal: With prices of 3D printers going as low as $250, they are increasingly becoming a must-have device for hobbyists and enthusiasts, who use them to “print” all kinds of pre-designed 3D objects or to turn their own creativity and imagination into objects.

Future of 3D Printing

In my opinion, in a few years 3D printers will be so common that one will barely remember how the world was before them (just as we now feel about computers, the internet, or smartphones)! You will find them in every repair shop, every dental clinic, every surgery room, every construction site, and every home! Perhaps in the not-so-distant future we won’t see many products in pre-manufactured form; instead, they will be manufactured to our specific customized requirements and desires – thanks to the invention of 3D printers!

What is Big Data and 4 Reasons Why it is so Important

Big Data is one of the new terms that we hear regularly these days along with “Internet of Things” (IOT), “Artificial Intelligence” (AI), and “Cloud Computing”. Interestingly, when you Google the term, you can read a wide variety of definitions – which is not unexpected considering how new the concept is!

In this article, I would like to try to provide a simple understanding of the concept from a technical point of view.

One of the best definitions I have encountered is this one: “any voluminous amount of structured, semi-structured, and unstructured data that has the potential to be mined for information”.

This definition makes more sense when we compare it with how data was traditionally handled. For old developers like myself, useful data always meant structured tables organized in relational databases. That enabled you to do meaningful searches and display the results in a useful format – like what you see in the CRM or ERP applications you use daily. But as the definition above describes, Big Data is now about big volumes of data that are not necessarily structured in database tables.
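The contrast between structured, queryable records and raw unstructured text can be made concrete with a toy example (the customer table and log lines below are invented purely for illustration):

```python
import re

# Structured data: fixed columns, so queries are trivial.
customers = [
    {"id": 1, "name": "Acme", "country": "DE"},
    {"id": 2, "name": "Globex", "country": "US"},
]
us_customers = [c["name"] for c in customers if c["country"] == "US"]

# Unstructured data: free text, so information must be *mined*
# (here with a simple pattern) before it becomes useful.
logs = [
    "2016-10-01 09:12 user alice viewed product 4711",
    "2016-10-01 09:13 payment failed for order 8813",
]
failures = [line for line in logs if re.search(r"\bfailed\b", line)]

print(us_customers)   # matching rows come straight out of the schema
print(len(failures))  # matching log lines had to be mined from raw text
```

Big Data tooling generalizes the second half of this sketch: extracting meaning from masses of text, video, and sensor output that never had a schema to begin with.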

Why is Big Data mentioned now?

The first question that arises after reading the above lines would probably be: What has changed in the past couple of years that has made Big Data so important?

In my opinion, there are 4 main reasons:

  1. Advances in Artificial Intelligence: masses of unstructured data have always been available – they are the source of any structured data! One of the main reasons we have been organizing such data into structured databases over the past few decades has been the lack of search and data mining algorithms. Finding and presenting data in a meaningful and useful way is a complicated process that requires very sophisticated algorithms. These sophisticated algorithms are now available.
  2. Advances in computer hardware: storing high volumes of data, searching through them, and accessing useful data in a timely manner call for advanced hardware that enables super-fast data access and very high processing power. Such hardware was not widely available a decade ago.
  3. Advances of the Internet: there is no doubt that the invention of the Internet was one of the most important events of the 20th century. During the 21st century, the internet has constantly provided connectivity at higher speeds and with more mobility. The result is access to an incredible amount of unstructured data from all over the world in the form of videos, pictures, text, and code.
  4. Increased amount of data: this is usually referred to as the three ‘V’s – Volume, Velocity, and Variety. The volume of data, the speed at which it becomes available, and its variety have simply made traditional methods of structuring it impossible!

As the world becomes more connected, not only do we face a huge growth of man-created data, but also an exponentially increasing amount of data created by machines. Some examples of such machines are:

  • CCTV Cameras: an increasing number of CCTV cameras capture constant streams of video. The volume of data they create makes timely analysis by humans simply impossible, and it is also impossible to “structure” the footage in any meaningful way.
  • IOT sensors: it is expected that there will be over 20 billion IOT devices by 2020, each constantly creating data. The variety of devices, and hence of the data they create, again calls for the “Big Data” solution.
  • Network Equipment Logs: network switches, routers, security appliances, servers, and other network equipment each create their own logs – again a huge amount of useful data, if it can be analyzed efficiently.

Conclusion

We are obviously on the verge of a new technological revolution, one that will be known for the emergence of robots, artificial intelligence, IOT devices, virtual reality, self-driving vehicles, and many other great technologies. But at the core of this revolution is “data” and how it can be analyzed efficiently and intelligently. “Big Data” is the concept that makes that happen; it includes technologies for storing such volumes of data and, more importantly, for extracting the required information from it efficiently.

3 Very Simple yet Critical Improvements I Hope to See in LinkedIn

The LinkedIn–Microsoft deal announced last Monday has probably been one of the hottest topics in last week’s technology news. I’ve read many articles both for and against this acquisition. I personally feel optimistic, especially after reading Jeff Weiner’s internal email to LinkedIn employees that was posted on Time.

At the same time, reading Jeff’s email as LinkedIn CEO, with his emphasis on LinkedIn’s mission and culture and his examples of how this acquisition can lead to great improvements, made me want to share some simpler ideas for improvement that probably millions of other active LinkedIn users would love to see. As a former software developer, I’ve also tried to suggest practical fixes that can be applied without much effort:

  1. Improvement of LinkedIn Messaging

This is on the top of my list for simple but essential improvements. LinkedIn messaging is the core interface through which you can communicate with your connections. Surprisingly, it is unbelievably painful and inefficient to use:

  1. There are two totally separate messaging interfaces (one for LinkedIn and one for Sales Navigator)! They should be unified.
  2. Both interfaces are extremely inefficient and hard to use. With some 3,000+ connections that I want to keep in active communication with, I find the interfaces simply useless.
  3. There are no formatting features, not even the simplest, such as bolding or highlighting a phrase.
  4. There is no proper integration with email clients.

I would like to see a unified, modern messaging system neatly integrated with email clients such as Outlook, with the advanced search features you will find in any other decent messaging interface.

  2. More Transparent Ban Policies

I’ve read numerous complaints and posts about this, and have personally suffered from it as well. More interestingly, the answers I received from LinkedIn support convinced me that even LinkedIn staff are confused about this! It is very annoying that while LinkedIn is supposed to help build healthy business relationships, the more active you are on LinkedIn, the more susceptible you are to having your account blocked! Although I fully support the intention behind this (as set out in LinkedIn’s User Agreement), it is simply not being implemented properly. Some suggestions would be:

  1. A clear, transparent definition of how the ban policy is triggered.
  2. A pre-alert notice giving active users a chance to prevent their account from being unjustly blocked.
  3. Temporary blockage of a user’s account should not also block/hide the LinkedIn groups they own!
  4. Visibility into whether you’ve received any “spam” or “I don’t know” flags, and a procedure to have them cleared!

  3. Correcting the Group Blockage Mechanism

Again, although the intention behind LinkedIn’s SWAM (Site-Wide Auto Moderation) has been to reduce spam in LinkedIn groups, it is simply not implemented properly. As an active member of some 30+ groups, I can’t describe how disappointing it was to see my educational/informative posts become subject to SWAM! I was further frustrated when I noticed there is absolutely no way to find out who initiated the SWAM. I literally had to message every group owner asking whether they had done it – and I was stopped after “InMailing” the first 15, as I had reached my monthly quota. More interestingly, I never found out who caused it! This is another shortfall I’ve seen tens of posts and complaints about. Some easy suggestions to fix this without compromising the intention behind SWAM:

  1. Apply SWAM only within the group that tagged the post as spam.
  2. Inform the user which group has triggered the SWAM.
  3. Inform the user when the SWAM is lifted.

Conclusion

While I have high hopes for the acquisition of LinkedIn by Microsoft, as expressed by Jeff, and while I wish the LinkedIn folks every success, I believe there are some very simple improvements that could greatly increase users’ experience and satisfaction. Let’s not wait for the acquisition to be fully in place to see these changes implemented.

How the Mobile Phone Revolutionized Africa (Part 1): Banking and Education

This is the first article of the two-part series: “How the Mobile Phone Revolutionized Africa”. This first part focuses on Banking and Education, and the upcoming second part will focus on Disaster Management, Agriculture, and Health.

In just a few years, the rise of mobile phone usage has transformed how people communicate and live in Africa. Mobile phones allowed Africans to buck the trend by skipping the landline stage of development and jumping straight into the digital age. Only about 2 percent of African households have a landline phone, but around 90 percent of adults own mobile phones. It is worth noting that most cell phones in Africa are what we call basic or feature phones – capable of calling, texting, and basic internet browsing only.

For most people in Africa, their first experience of the internet comes through a mobile phone. Around 70% of mobile users browse the internet on their phones, and Africa’s mobile broadband usage is growing at a rate of more than 40% – twice the global average. This prevalence of mobile phones is mainly due to the weak landline infrastructure across the continent, which makes connecting through a desktop or laptop computer quite difficult. Mobile phones are also much cheaper to buy today, making them ubiquitous across the continent.

Below is an infographic showing the use of mobile phones in Africa:

Another reason for the heavy use of mobile phones in Africa is the frequent power shortages that lead to blackouts. This has directly led more Africans to adopt mobile phones, since they do not need to be plugged in all the time. The result is a unique environment in which mobile technology has been adapted for a wide range of uses: lowering information barriers, improving access to financial and health services, boosting commerce, and bringing people together.

Below are some examples of how mobile phones have revolutionized communications and transformed the lives of Africans for the better:

Banking

M-PESA is a mobile money transfer service launched in 2007 by Safaricom, Kenya’s largest mobile operator, together with Vodafone. Within five years, M-PESA was providing services to 15 million Kenyans – more than a third of the country’s population. According to a survey by the Gates Foundation and the World Bank, more than half of adults in Kenya and Gabon use mobile money.

The success of M-PESA in Kenya is inspiring similar initiatives across the entire continent. Mobile banking is now extremely popular in Africa, especially since governments struggle to extend banking services to large parts of the population – only one in five adults across sub-Saharan Africa owns a bank account. Many Africans now use mobile money to pay bills, buy goods, and make payments to individuals. Another popular use of mobile banking is receiving remittances from relatives living in other countries.

Education

Nokia capitalized on the growing popularity of mobile communications and social networking in South Africa to launch MoMaths, a mathematics teaching tool that targets users of the instant messaging platform Mxit. Mxit is South Africa’s most popular social media platform, with more than 10 million active users in the country.

The potential for transforming Africa’s educational system using mobile technology is massive, as mobile phones are cheaper to own and easier to run than PCs, and they are gaining ground as tools for delivering teaching content. This facilitation of education through social networking and mobile networks will help reduce the number of African children who are not able to receive any formal education.

This article will be continued in the second part of this series titled: How the Mobile Phone Revolutionized Africa (Part 2): Disaster Management, Agriculture, & Health.

3 Revolutionary Ways of Providing Global Internet Coverage

Despite constantly growing internet penetration, according to a report by the UN’s Broadband Commission, 54% of the world’s 7.4 billion people still don’t have access to basic internet services. Two-thirds of the globe also still lacks any internet coverage, making it extremely costly and financially unfeasible to provide internet access for the roughly 700 million people living in these areas.

While almost everyone agrees on the importance and the great impact that internet connectivity could bring to these 4 billion people, this seems unlikely to happen through the known conventional methods of internet connectivity, because:

  • Fiber cabling would be extremely costly and impractical for covering such large and dispersed locations.
  • Mobile networks (or other ground-based wireless solutions) are also impractical, as they would require large towers roughly every 50 km, with huge CAPEX and OPEX costs that would not be feasible for mobile operators.
  • VSAT solutions would be more practical for such locations; however, they are currently too expensive to be affordable for the majority of the unconnected population.

So how could we practically provide low-cost (or even free) truly global access to the internet? The answer is low-orbit transponders in the sky!

Standard VSATs depend on geostationary satellites located in geostationary Earth orbit (GEO), some 35,000 km above the ground. Only at this orbit does the satellite’s speed of circling the Earth match the Earth’s own rotation, so the satellite stays at a fixed position in the sky and you can point your dish antenna at it. The drawbacks, however, are the high cost of the satellites and their launch, the need for costly ground equipment including big directional antennas, and high latency, since every piece of data must travel 35,000 km up to the satellite and back down to Earth.
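The latency penalty of GEO, and the gain from moving lower, follows directly from the speed of light. A quick sketch of the best-case propagation delay (the 1,200 km LEO altitude is an illustrative assumption; real delays are higher once processing and routing are added):

```python
SPEED_OF_LIGHT_KM_S = 299_792

def round_trip_ms(altitude_km: float) -> float:
    """Best-case up-and-down propagation delay for one satellite hop, in ms."""
    return 2 * altitude_km / SPEED_OF_LIGHT_KM_S * 1000

geo_ms = round_trip_ms(35_000)  # the GEO altitude quoted above
leo_ms = round_trip_ms(1_200)   # an assumed typical LEO altitude
print(f"GEO hop: {geo_ms:.0f} ms, LEO hop: {leo_ms:.0f} ms")
```

A single GEO hop costs over 230 ms before any processing, which is why GEO links feel sluggish for interactive use, while a LEO hop of a few milliseconds is comparable to terrestrial networks.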

To solve these problems, the satellites or alternative transponders/repeaters should be positioned closer to the earth (Low Earth Orbit – LEO). However, doing this raises new problems:

  • Unlike a GEO satellite, which can easily cover one third of the globe’s surface, LEO solutions would each cover only a few tens of kilometers, and hence, for true global coverage, hundreds or thousands of them would be required.
  • It would be extremely difficult and expensive to keep a repeater at LEO heights in a fixed location. For LEO satellites to stay in their orbit, they have to circulate the earth at high speeds.
  • Because of the above, to maintain link connectivity, the repeaters must be able to seamlessly hand each connection over to one another.

Therefore, providing a solution consisting of hundreds or thousands of repeaters in the sky at LEO orbit, while keeping the solution low-cost and affordable is the challenge that needs to be tackled.
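As a rough illustration of why the repeater count explodes at lower altitudes, here is a back-of-the-envelope sketch. The coverage radii and the overlap factor are illustrative assumptions, not figures from any of the projects below:

```python
import math

EARTH_RADIUS_KM = 6371

def satellites_needed(coverage_radius_km: float, overlap: float = 1.5) -> int:
    """Very rough count of repeaters for full global coverage; the
    overlap factor pads for hand-off between adjacent footprints."""
    earth_area = 4 * math.pi * EARTH_RADIUS_KM ** 2
    footprint = math.pi * coverage_radius_km ** 2
    return math.ceil(earth_area / footprint * overlap)

print(satellites_needed(50))     # tens-of-km footprint: ~100,000 repeaters
print(satellites_needed(1000))   # 1,000 km footprint: a few hundred
```

The shape of the result matches the projects discussed next: balloons and drones with small footprints need large fleets kept over populated areas, while LEO satellites with footprints of hundreds of kilometers can manage with constellations in the hundreds.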

In this article, I will briefly explain the 3 revolutionary ways to provide global internet coverage:

  1. Using gas-filled balloons acting as satellites
  2. Using unmanned aircrafts with very large wings acting as satellites
  3. Using LEO (Low Earth Orbit) satellites

These 3 revolutionary ideas are being actively and seriously pursued by 5 leading companies – Alphabet (Google), Facebook, SpaceX, Boeing, and OneWeb – to tackle this challenge.

Google’s “Loon” Project

The Alphabet (Google) solution for providing global internet coverage is called the “Loon” project. The idea is to use a fleet of gas-filled balloons hovering around the world, powered by solar cells, at an estimated height of 20-30 km above the ground. Each balloon is expected to operate for about 3 months at a time.

One of the main challenges of this project is how to maintain the location of these balloons. Google is trying to solve this challenge by using “software algorithms to determine where its balloons need to go and then move each one into a layer of wind blowing in the right direction”.

If this challenge is resolved, the benefits of Google’s solution are the relatively low cost of each balloon and the ability to keep a balloon within a relatively small geographical area, which would reduce the number of balloons needed.

Google has been testing some of its balloons but has not yet given a clear timeline for when the system would be put into real service, probably due to the predictably tough technical challenges.

Facebook’s “Aquila” Project

Facebook has chosen to invest in unmanned aircraft with very large wings (a wingspan of about 40 meters) covered by solar cells, enabling them to stay aloft for about 3 months at a height of about 20 km. Unlike Google’s solution, which uses LTE technology, Facebook’s “Aquila” bets on laser beams to deliver high-speed internet within an 80 km radius on the ground below.

Exactly what stage this project is at, I couldn’t find any clear public information about. But the main challenges of the project, in my opinion, would be the durability of the drones and keeping them cost-effective.

OneWeb’s LEO Satellite Fleet

OneWeb, in partnership with Qualcomm and Airbus, has taken a more “classic” approach: setting up a fleet of about 700 low-cost satellites circulating in LEO at about 1,200 km altitude. This is not actually a new concept – Iridium has operated a network of 66 satellites orbiting at a height of about 780 km, providing global phone and low-speed data coverage for Iridium devices, since 1998. (Iridium has now started launching its new Iridium NEXT fleet of satellites, which will provide data connectivity at speeds of up to 8 Mbps.)

What makes OneWeb’s solution different from Iridium is that it is investing in mass production of satellites to considerably decrease the cost per satellite and, of course, to provide high-speed internet connections at much lower prices. OneWeb’s user terminals would also provide LTE, 3G, and WiFi internet connections to surrounding areas.

OneWeb’s solution is expected to begin its services by 2019.

SpaceX and Boeing Satellite Solutions

There are also other companies racing with OneWeb on setting up large LEO satellite fleets to provide global internet connectivity – namely SpaceX and Boeing.

Many of us know SpaceX for its services to the International Space Station. It has unveiled a plan to launch some 4,000 small, low-cost, disposable satellites at about the same altitude as OneWeb. This project is also funded by Google, and tests are expected to begin in 2016.

Boeing joined the battle just this June by revealing its plans for deploying some 3,000 satellites, 1,400 of which are to be put in orbit within 6 years. Interestingly, Boeing is also planning to place its satellites at the same 1,200 km LEO orbit.

Conclusion

While we should wait and see who will win this battle of technologies for providing low-cost global internet services, all these attempts point to a promising future in which everyone on the planet has low-cost (if not free) access to the internet – something a recent UN resolution has acknowledged as a basic human right.

The Role of VSAT in Supporting NGOs during Disasters in Africa (Part 2): Zambia and Cape Verde

This is the second article of the two-part series “The Role of VSAT in Supporting NGOs during Disasters in Africa”. The first article focused on telemedicine projects in Mozambique and Uganda. This article will look at the role of VSAT during disasters in two more African countries: Zambia and Cape Verde.

Emergency telecommunications play a critical role in the immediate aftermath of disasters by ensuring the timely flow of vital information that is much needed by government agencies and other humanitarian actors involved in rescue operations and providing medical assistance to the injured. The impact of disasters is even worse for those living in remote and isolated areas with no access to basic information and communication facilities that are essential in providing the alerts so vital to saving lives.

The best solution to utilize during emergencies is VSAT technology. VSAT is not affected by natural calamities like earthquakes, floods, and storms as much as terrestrial networks. This is why VSAT technology directly supports many NGOs and military operations, allowing them to cope with contingencies. Because of this, the International Telecommunications Union (ITU) considers emergency telecommunications such as VSAT to be a core element of its projects that integrate telecommunications/information and communication technologies in disaster prediction, detection, and alerting.

Emergency VSAT Solutions – Saving Lives During Disasters

1) Flood in Zambia 2008

The main emergencies that occur in Zambia are very much water-related and are predictable. Every year, there are floods along the river areas, primarily the Zambezi belt. When floods occur, people are often displaced. In the 2008/2009 floods, over 4,000 people were displaced along the Zambezi belt. The 2008/9 rain season peaked in January 2009, with all parts of Zambia receiving normal to above-normal rainfall. The heavy precipitation in the country, coupled with similar rainfall in neighboring Angola, caused flooding along the Zambezi and Kwando Rivers, which displaced over 102,000 households, damaged growing and mature crops, and posed significant threats of waterborne diseases. The five affected provinces were the Western, North-Western, Eastern, Luapula, and parts of the Northern Provinces. The government undertook rapid assessments in the affected districts, detailing the immediate need for food aid, shelter, clean and safe water, and rehabilitation of infrastructure.

The International Telecommunications Union (ITU) provided VSAT satellite terminals to Zambia to assist officials in their relief efforts after severe floods affected 19 districts across the country. The floods destroyed roads and terrestrial communication links, hampering the coordination and delivery of assistance. This deployment of emergency VSAT solutions proved critical for the government and allowed humanitarian aid agencies to conduct rescue operations, provide medical assistance, and support recovery. The mobile VSAT terminals deployed by the ITU were easily transported by road and air to the affected regions and facilitated the coordination of relief operations by both government and humanitarian agencies to aid the victims.

2) Volcano Eruption in Cape Verde

The eruption of the Pico do Fogo volcano began on the 23rd of November, 2014 and continued until the 8th of February, 2015. By the end of the eruption, the lava had covered an area of approximately 520 hectares with a lava wall averaging 8 meters in height. The 88 days of intense and effusive eruption culminated in the total destruction of all houses and community infrastructure in the localities of Portela and Bangaeira – Chã das Caldeiras, forcing the evacuation and displacement of 994 people. As of the 8th of December, 2014, lava had destroyed 90 buildings, including the national park headquarters, wine production facilities, a primary school, and a hotel, as well as more than 429 hectares of land, resulting in great material and economic loss and leaving many without a source of income.

The International Telecommunications Union (ITU) deployed satellite communication equipment following the eruption of the Fogo volcano in November 2014, which affected most of the population of Fogo Island. The equipment was used for coordination and relief activities on the ground. The ITU deployed Iridium satellite phones and VSAT communication terminals to support the preparedness and rescue activities.

Vizocom has an NGO Support Program, where Vizocom will provide fast and reliable communication services with exceptionally low prices to support NGOs and their causes.

The Role of VSAT in Supporting NGOs during Disasters in Africa (Part 1): Mozambique and Uganda

Natural disasters such as floods, fires, and storms affect thousands of people in Africa. From the destruction of buildings to the spread of disease, natural disasters can devastate entire countries overnight and seriously disrupt the community with massive human, material, economic and environmental losses. To prevent these losses during disasters, emergency communication systems are critical in terms of safety, and ensuring the continuous operation and rapid recovery of emergency communication systems is more important than ever.

The best emergency solution to utilize in these situations is VSAT technology. VSAT solutions act as very dependable backbones for communications during and after calamities. The inherent nature of VSAT communications via satellite and its connectivity advantages makes VSAT the ideal means of communication during emergencies.

During disasters, the first action should be to connect the affected site to multiple other sites, and this can be done quickly using VSAT. The other important tool for communication is the satellite phone, which does not rely on ground infrastructure for connectivity. Below are examples of how VSAT solutions have directly supported NGOs’ relief operations during disasters.

Emergency VSAT Solutions – Saving Lives during Disasters

1. Cyclone in Mozambique in 2008

The tropical cyclone Jokwe hit northern and central Mozambique on the 9th of March, 2008. The Category 4 cyclone had winds of up to 170 km per hour and brought torrential rains, prompting the government to declare a Red Alert, which is the highest level issued for natural disasters. The red alert was issued for the Provinces of Nampula, Zambézia and Sofala, as well as the coastal areas of the Districts of Maganja da Costa, Pebane, Moma, Angoche, Mogovolas, Mogincual, Mossuril, and Nacala. A lesser Yellow Alert was issued in the central provinces, specifically in the districts of Inhassunge, Chinde, Marromeu, Chiringoma and Dondo. According to the Government National Institute for Disaster Management (INGC), tropical cyclone Jokwe killed 7 people, damaged around 30,000 houses, 200 schoolrooms, and dozens of health clinics, prisons and other public buildings. An estimated 41,000 hectares of maize were destroyed.

The Emergency Telecommunication Cluster (ETC), with support from Telecom sans Frontieres, installed VSAT equipment and provided support to INGC and the humanitarian community in each of the emergency operation centers in Caia, Mutarara, and Mopeia. Data connectivity was provided in Caia through an ETC VSAT station; in Mutarara, through the World Vision VSAT station; and in Mopeia, using UNICEF’s BGAN portable satellite terminal. The emergency VSAT systems in place helped the NGOs conduct rapid emergency procedures. Telecom sans Frontieres also installed a BGAN and proxy server in Caia to decrease the usage load on the VSAT at the CENOE office. Lacking outside contributions, the Emergency Telecommunication Cluster used advanced funds from UNICEF and WFP.

2. Flood in Uganda

Unusually heavy rainfall from July to November of 2007 led to flooding and water-logging across a number of districts in eastern and northern Uganda, particularly in the Districts of Soroti, Amuria, Katakwi, Bukedea, Kumi, Lira and Sironko. This gave rise to a major humanitarian response across all sectors. An estimated 20,000 households were severely affected and 58,000 people were displaced. With about 80 percent of crops destroyed by floods, food insecurity was imminent. The flooding disrupted delivery of social and economic services like education, health, trade and agriculture – which resulted in increased risk of communicable diseases especially as the floodwater receded. Malaria and diarrheal disease incidences greatly increased by over 30%. Several districts were ravaged by torrential rains and flash floods that swept through the country, destroying road and communication links, and submerging crops, which compelled the Government to declare a state of emergency.

The International Telecommunications Union (ITU) deployed 25 VSAT terminals to help restore vital communication links in the aftermath of severe floods that affected the eastern and northern regions of Uganda. With the restoration of the communication links, designated government officials and other humanitarian agencies were able to coordinate relief operations efficiently. The ITU provided both Thuraya hand-held satellite phones and Inmarsat Global Area Network (GAN) terminals. The Thuraya satellite phones used both satellite and GSM networks to accurately locate GPS coordinates for aid relief and rescue. The Inmarsat GAN terminals were mainly used for voice communications and high-speed data.

This article will be continued in the second part of this series titled: The Role of VSAT in Supporting NGOs during Disasters in Africa (Part 2): Zambia and Cape Verde.

Vizocom has an NGO Support Program, where Vizocom will provide fast and reliable communication services with exceptionally low prices to support NGOs and their causes.

CAT8 Cabling – What Is It and When Will It be Out for Use?

If you’ve been in the LAN cabling business, you have probably heard about the CAT5, CAT6, and perhaps CAT7 standards. It would therefore not be hard to guess that the next standard in network copper cabling is CAT8 (Category 8).

In this article, I’ll give a very short overview of what all these category standards are about and then explain what you should expect from the upcoming CAT8 standard.

Category Standards for Twisted Pair Cables

The category standards, defined in the ISO/IEC 11801 international standard, describe the characteristics of telecommunication cabling systems for both twisted pair and fiber optic (FO) cabling. Here I will focus only on twisted pair.

Without making it too complicated: the reason copper cables used in communications (be it voice or data) are twisted into pairs is that twisting the two wires carrying a signal makes them, to some extent, “contain” the electromagnetic field created by the electrical signals passing through them.

The actual speed of data you can pass through a twisted pair is limited by the frequency characteristics of the cable. The higher the frequency rating, the better the cable can “contain” the electromagnetic fields at higher frequencies, allowing higher throughput. (It can be mathematically proven that higher data speeds produce higher signal frequencies.)
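The bandwidth-to-throughput relationship alluded to here is captured by the Shannon–Hartley theorem, C = B·log2(1 + SNR). A small sketch (the 30 dB SNR is an illustrative assumption, not a cable specification):

```python
import math

def shannon_capacity_mbps(bandwidth_mhz: float, snr_db: float) -> float:
    """Theoretical channel capacity C = B * log2(1 + SNR)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_mhz * math.log2(1 + snr_linear)  # MHz * bits/Hz -> Mbps

# More usable bandwidth raises the capacity ceiling proportionally:
print(shannon_capacity_mbps(100, 30))   # 100 MHz channel, ~997 Mbps
print(shannon_capacity_mbps(250, 30))   # 250 MHz channel, ~2492 Mbps
```

Note how the 100 MHz figure lines up roughly with CAT5e’s 1 Gbps ceiling; real cables fall short of this theoretical bound.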

One of the factors that improve the frequency characteristics of twisted pair cables is how tightly they are twisted – more twist means better frequency characteristics. The other factor is the separation between the different twisted pairs, including the use of individually shielded pairs.

Now, as the technology evolves there is increasing demand for higher data speeds and hence the need for defining and manufacturing twisted pair cables that can support those higher speeds.

The table below briefly shows the current Category standards for twisted pair cables (I’ve simplified it a bit):

Standard Frequency Limit Data Speed Date of Usage
CAT 2 1 MHz 4 Mbps 1980s
CAT 3 16 MHz 10 Mbps 1990s
CAT 5/5e 100 MHz 100 Mbps 2000-2010
CAT 6 250 MHz 1 Gbps 2005 – onwards
CAT 7 600 MHz 10 Gbps 2010 – onwards
CAT 7a 1000 MHz 10 Gbps+ 2014 – onwards

Note: the CAT5e standard was introduced as an improvement of CAT5 and can actually support up to 1 Gbps. There is also a CAT6a standard commonly referred to by manufacturers, but it is not an official ISO standard.

Category 8 Standard

As explained above, the CAT8 standard is obviously expected to provide better frequency characteristics and hence support higher data speeds.

The standard has been under development since March 2013, and a draft was finally published for review in June 2016. The final version of the standard could be out as early as Q4 2016.

CAT8 is expected to support bandwidths of up to 2 GHz (2000 MHz) over up to 30 meters of cabling, enabling speeds of 25 Gbps and 40 Gbps.

CAT8 cables will look similar to lower category cables and will most probably still be terminated in RJ45 connections.

As the cable length for such speeds is limited to 30 meters, CAT8 will most probably be used in the form of factory-made patch cords for interconnecting equipment in data centers.

Conclusion

The current CAT8 standard is expected to be finalized sometime in late 2016. We might need to wait for another year before seeing related products (CAT8 cables, connectors, patch panels …) in the market.

However, at least for now, it seems that CAT8 usage will be mostly limited to server rooms and data centers, as an easier way to make high-speed interconnections between servers and networking appliances using the very familiar copper patch cords.

5G – When Can You Really Expect to See Next-Gen Mobile Technology?

These days there is a lot of talk about “5G”, the new fifth generation of mobile connectivity technology. Many major operators and vendors, such as Samsung, AT&T, Verizon, Nokia, and Huawei, are working at full force, in tight competition, to make 5G happen.

But what exactly is 5G and when will it be in use?

What is 5G?

Until now, the only thing that can be confidently said about 5G is that it is, obviously, the fifth generation of mobile network technology, and that it promises 4 major improvements:

1) Much Higher Speed

5G promises speeds as high as 10 Gbps (in theory) and as high as 100 Mbps in congested networks, which is many times higher than the current 4G platform.

2) Lower Latency

While 4G has a latency of about 30-50ms, the latency of 5G is expected to be in the range of 1ms or less.

3) Number of Connections

While 4G networks can provide up to thousands of connections, a 5G network is expected to increase that to millions of connections per square kilometer. This is especially important given the expected explosive growth of IoT, to about 20 billion devices by 2020.

4) Lower power consumption

5G is expected to consume less battery power than 4G.

Apart from the above, the actual technical details of 5G are not yet defined, and there is fierce competition to finalize the standard, which is expected no sooner than 2018.

When will 5G become available?

Current estimates point to 2020 as the year we can start using 5G. There is talk of providing limited 5G services as early as 2018 to cover the Winter Olympics in South Korea.

However, considering all the remaining challenges that need to be resolved before 5G can actually replace the 4G infrastructure, it seems we are still a few years away from a widespread 5G network.

Some of these challenges are:

1) Defining the final standard

2) Providing the required backbone infrastructure that can handle the very high speeds required

3) Embedding the required hardware in all mobile cells and other mobile devices.

Conclusion

The 5G network is inevitable – it is surely the required communication infrastructure for the third decade of the 21st century, complementing other emerging technologies such as Artificial Intelligence (AI) and live video communications. However, it seems we will need to wait about 5 more years to see wide 5G coverage for daily use.

What is WiMAX and How Does it Differ from WiFi?

When speaking about wireless networks, you might have heard the term WiMAX increasingly used as a technology that will replace WiFi. If you are curious about the differences between the two, this article is meant to answer exactly that.

WiMAX stands for “Worldwide Interoperability for Microwave Access” and is a standard-based technology for providing a wireless alternative to cable and DSL connections.

This, however, is also one of the uses of WiFi. Although WiFi wireless devices are mainly used for short-range wireless connection of end-user devices such as laptops, tablets, and smartphones, they are also used for site-to-site interconnections.

Before I explain the core difference between the two, let’s first take a look at the table below, which shows some of the basic differences between the two wireless standards:

Specifications WiMAX WiFi
IEEE Standard 802.16x 802.11x
Versions of standard 802.16a, 802.16d and 802.16e 802.11b, 802.11g, 802.11n
Official Release 2004 1997
Frequency bands supported 2.5,3.5 and 5.8GHz supported 2.4 GHz and 5 GHz supported
Data rate 30-40Mbps, but lately updated to 1Gbps 54Mbps, but lately up to 1.2Gbps
Channel Bandwidth Flexible (1.25 to 20 MHz) 10 or 20 or 40 MHz
Normal Ranges 30+ Km 100m for end-user devices (up to 5Km for outdoor point to point connections)

What is the main technical benefit of WiMAX?

WiMAX is not a replacement for WiFi – instead, while WiFi is the de-facto global standard for wireless interconnection of end-user devices, WiMAX addresses a specific technical deficiency of WiFi for interconnecting multiple sites.

The main drawback of WiFi technology for point-to-multipoint connections is that it uses a connectionless protocol called CSMA/CA (carrier sense multiple access with collision avoidance). Without going into deep technical detail: since all devices in a WiFi network share the same frequency channel, each device “listens” to make sure no other device is transmitting before sending its own data, to prevent collisions – i.e., there is no centralized management in the network. While this makes network setup very simple and straightforward (a benefit for end-user devices), it creates major problems in larger networks, especially as distances increase.

WiMAX, on the other hand, is connection-oriented and uses a centralized scheduling algorithm. Unlike in a WiFi network, in WiMAX you define and set up each subscriber station (SS) on the base station, including specifying what bandwidth each SS should be given. The base station therefore knows the exact number of subscriber stations and allocates a time slot (access slot) to each. This protocol synchronizes data transmission between all the stations on the network and completely eliminates the collision issues of a WiFi network. It enables the efficient and reliable connection of as many as 80 subscribers on a WiMAX network with guaranteed QoS (Quality of Service), while on an outdoor WiFi network, adding more than 10 CPEs causes great inefficiency and unpredictable quality of service.

To give an analogy: WiFi is like a crossroads with no traffic light, where cars need to check and make sure no one else is crossing before moving on, while WiMAX is like having a traffic officer (the base station) giving each car its turn to pass.
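The traffic-officer idea can be sketched in a few lines of code. This is a simplified illustration of slot-based scheduling, not the actual 802.16 MAC:

```python
def build_frame_schedule(subscribers, slots_per_frame):
    """Round-robin assignment of access slots to subscriber stations:
    the base station decides who transmits in each slot, so no two
    stations ever transmit at once."""
    return {slot: subscribers[slot % len(subscribers)]
            for slot in range(slots_per_frame)}

# Three subscriber stations sharing a six-slot frame; contention
# (and hence collisions) is eliminated by construction:
schedule = build_frame_schedule(["SS-1", "SS-2", "SS-3"], slots_per_frame=6)
for slot, station in sorted(schedule.items()):
    print(f"slot {slot}: {station} transmits")
```

In a real base station the slots would be weighted by each SS’s provisioned bandwidth rather than assigned evenly, but the collision-free property is the same.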

Conclusion

While WiFi is and will remain widely used for short-range wireless connection of end-user devices, WiMAX is the correct, efficient wireless solution for long-range connection of multiple sites, such as providing internet connections to multiple homes or interconnecting multiple buildings in a large compound.

5 Key Factors in Designing a Point to Point Microwave Link

Wireless and microwave point to point links are widely used as a quick-to-deploy and cost-effective alternative to fiber optic cabling for interconnecting the networks of two sites at distances from a few hundred meters up to 50 km or more.

However, like any other solution, and probably more than many others, establishing a reliable, high-quality microwave point to point link can be quite challenging; if it is not properly designed and implemented, it can suffer major quality issues such as low throughput, link instability, and higher-than-expected latency.

In this article, I briefly explain some key factors that should be considered in a proper point to point microwave link design, so that IT managers and engineers who are not experts in the field have enough background to properly evaluate and oversee such work.

What is a good point to point microwave link?

The quality of a point to point microwave link can be determined by the measurements below:

1) Signal to Noise Ratio (SNR):

This ratio is measured in dB and shows the strength of the signal versus the noise level on that frequency channel. The higher the value the better, but it should be at least 20 dB.

2) Bit Error Rate (BER)

This figure shows the percentage of bits received with errors versus the total number of bits transmitted during a period of time. The value is usually expressed as 10 to a negative power. The lower this figure, the better the link quality. Good BER values are usually in the range of 10^-8 or better.
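As a quick worked example of the notation (the counts are made up for illustration):

```python
def bit_error_rate(error_bits: int, total_bits: int) -> float:
    """BER = errored bits / total transmitted bits."""
    return error_bits / total_bits

# 5 errored bits out of a billion transmitted:
ber = bit_error_rate(5, 1_000_000_000)
print(f"{ber:.0e}")   # prints 5e-09, i.e. better than the 1e-08 target
```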

3) Bandwidth Throughput

This is the actual amount of data that can be transferred per second, expressed in bits per second – for example, a throughput of 100 Mbps means about 100 megabits of data can be transferred over the link every second. Obviously, the larger this figure, the better the link.

4) Latency

Link latency determines how much time it takes to transfer the data – for a good microwave link, the latency should be constant and should not exceed 2-3 ms. The easiest way to check latency is to ping the destination device.

5) Link Availability

This parameter is expressed as a percentage and indicates for what portion of a given period, usually 12 months, the link has been up. A reliable microwave link should have an availability as good as 99.999%.
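Availability percentages are easier to grasp when translated into a concrete downtime budget:

```python
def allowed_downtime_minutes_per_year(availability_percent: float) -> float:
    """Downtime budget implied by an availability figure."""
    minutes_per_year = 365 * 24 * 60
    return minutes_per_year * (1 - availability_percent / 100)

for a in (99.9, 99.99, 99.999):
    print(f"{a}% -> {allowed_downtime_minutes_per_year(a):.1f} min/year")
```

So a 99.999% (“five nines”) link is allowed only about 5 minutes of outage per year, versus nearly 9 hours at 99.9%.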

As microwave links can be affected by the time of day as well as many geographical factors, for critical links it is important to run a continuous test of at least 48 hours.

5 Key Factors for a Stable Microwave Link

Below are 5 key factors to ensure a reliable and stable microwave link:

1) Frequency Selection

Microwave links operate in the spectrum from 2.4 GHz to 42 GHz. The higher the frequency, the higher the available capacity, but at the same time the effective range shortens and the link becomes more susceptible to rain and high humidity. To use a frequency, a license should usually be obtained from the regulatory authorities of the country. There are also a few “license-free” frequency bands – mainly 2.4 GHz, 5 GHz, and 24 GHz.

While these license-free bands are in much greater use, professional solutions rely more on licensed frequencies, which guarantee a clear spectrum and greatly improve link reliability.

2) Calculating Capacity

The required capacity (bandwidth throughput) of a point to point microwave link is a key design parameter.

As the capacity increases, you need to design the link for a higher SNR, resulting in the need for more powerful equipment and larger antennas.

3) Calculation of Line of Sight and Path Loss

For point to point microwave links, the antennas on the two sides should be in line of sight of each other. The line of sight can be limited by natural or man-made obstacles and also by the earth’s curvature, which limits the practical distance of microwave links to 50-60 km (and even that calls for 100 m tower heights and large dish antennas).

There are now many software applications that can accurately predict line of sight and path loss; however, a visual survey by an experienced engineer is also necessary.
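Two of the standard rule-of-thumb formulas such tools apply – the first Fresnel zone radius and the earth-curvature bulge – can be sketched as follows (the 40 km / 7 GHz link parameters are illustrative):

```python
import math

def first_fresnel_radius_m(d1_km: float, d2_km: float, freq_ghz: float) -> float:
    """Radius of the first Fresnel zone at a point d1/d2 km from each end."""
    return 17.31 * math.sqrt((d1_km * d2_km) / (freq_ghz * (d1_km + d2_km)))

def earth_bulge_m(d1_km: float, d2_km: float, k: float = 4 / 3) -> float:
    """Height of the earth's curvature bulge (k = effective-earth factor)."""
    return (d1_km * d2_km) / (12.74 * k)

# Mid-point of a 40 km link at 7 GHz:
d1 = d2 = 20
print(f"Fresnel radius: {first_fresnel_radius_m(d1, d2, 7):.1f} m")  # ~20.7 m
print(f"Earth bulge:    {earth_bulge_m(d1, d2):.1f} m")              # ~23.5 m
```

Both figures add up when choosing tower heights: the path must clear obstacles plus the bulge, ideally with the first Fresnel zone (or at least 60% of it) kept unobstructed.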

4) Interference and Fading

Interference and fading are other issues that, if not handled correctly, can considerably affect link reliability.

Apart from issues such as Fresnel zone clearance, rain fade, and multipath fading, which require proper consideration during the path loss calculation, there are other sources of interference, such as installing radios adjacent to other radios, which can greatly reduce a radio’s receive sensitivity (like trying to hear a faint voice while standing next to a big speaker playing music).

5) Redundancy

The reliability of the link can be greatly increased by adding redundancy. At frequencies of 7 GHz and above, dual redundant radios can be connected to the same antenna.

This is not possible at lower frequencies, where two independent radio links should be installed with sufficient frequency and space diversity.

Who can design and implement a successful point to point microwave link?

Designing and implementing a successful, reliable point to point microwave link requires good theoretical knowledge of RF design and antennas, as well as a good deal of practical experience.

The concepts mentioned above are the primary things to ask an implementer about to make sure they have the required knowledge and expertise. You should also ensure you receive clear test reports for the established link.

6 IoT Applications that Improved People’s Lives in Africa – A Story of 6 Countries

As explained in the previous posts “What is IoT? A short, simple explanation” and “Top 5 Ideas for IoT That Could Change your Life“, the Internet of Things (IoT)-related technologies are currently booming at an unprecedented pace. There are hundreds of thousands of new ideas on how businesses can benefit from the IoT concept, and this list is expanding every single day.

The last seven years have seen a rise in activities geared towards IoT across the globe among technology practitioners, private businesses, and education institutions. Back in 2013, it was estimated that about 80 things were being connected to the internet per second, and by 2020 it is estimated that about 250 things will be connected per second – that’s about 50 billion things in total, all connected to the internet.

With this massive number of interconnected things, businesses all over the world are positioning themselves to tap into the huge potential that IoT brings. The African region has been markedly slower in embracing the IoT concept compared to most developed nations, but Africa is now increasing its uptake of IoT. Businesses in countries all over Africa are now using IoT applications to improve their business environment and the lives of citizens.

IoT adoption in Africa is now an area of great interest. Below are great examples of how IoT has helped businesses and revolutionized people's lives in 6 countries in Africa: Tanzania, South Africa, Kenya, Nigeria, Egypt, and Namibia.

1. Preventing Oil Pilferage in Tanzania

Usangu Logistics is a heavy transport company with a fleet of over 100 trucks and tankers dedicated to serving thousands of customers in Tanzania with oil, lubricants, and other bulk products. One immediate challenge the company faced was that after a tanker was loaded with product for transport, drivers would often pilfer oil along the way and later sell it on the black market. The company's trucks and tankers used a lock system intertwined with a metal loop fitted around the closing mechanism of the tank's hatch. However, the system could not prevent a driver in possession of the lock's combination from opening the hatch. The company could not fully control the drivers and did not know when, where, or how much oil had been stolen along the way. This resulted in a big loss for the company and prompted the search for an immediate solution.

The IoT Solution – RFID

The immediate solution came in the form of an IoT application through the use of radio frequency identification (RFID). An IoT-enabled gateway device is attached to the truck's cabin area, and the seals are tagged with RFID-enabled tags fastened to the truck's hatch. Each tag transmits a signal to the gateway device every eight seconds, and the signal is relayed to the main office for interpretation and further action. The software stores the seal status and the location of the trucks, so the truck and seal information can be monitored in real time. Any attempt to open the hatch is recorded, and the culprit can immediately be identified. The implementation of this IoT-enabled solution resulted in a dramatic drop in pilfering of the oil that the trucks and tankers were carrying.
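The gateway-and-tag logic described above can be sketched as a short alerting routine. This is a hedged illustration rather than Usangu Logistics' actual software: the eight-second heartbeat comes from the description above, but the function names, report format, and three-missed-reports threshold are assumptions made for the example.

```python
from datetime import datetime, timedelta

HEARTBEAT = timedelta(seconds=8)   # each tag reports every eight seconds
GRACE = 3                          # missed heartbeats tolerated before alarming

def check_seal(reports, now):
    """Flag a truck whose seal reports 'open' or whose tag goes silent.

    `reports` is a list of (timestamp, seal_ok, location) tuples,
    newest last, as received from the truck's gateway device.
    """
    if not reports:
        return "NO DATA"
    ts, seal_ok, location = reports[-1]
    if not seal_ok:
        return f"ALERT: seal opened near {location} at {ts:%H:%M:%S}"
    if now - ts > GRACE * HEARTBEAT:
        return f"ALERT: tag silent since {ts:%H:%M:%S} (last seen near {location})"
    return "OK"
```

Treating a silent tag as suspicious matters here: a driver who simply removed or shielded the tag would otherwise go unnoticed.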

2. Electronic Tolling System in South Africa

IoT began in South Africa over a decade ago and has been shaping the country for the last ten years, even without many people noticing it. South Africa has been building IoT technology for many years, including a nationwide network of sensors connecting everything from electricity grids to traffic controls.

E-toll System in Gauteng Highway

At the beginning of 2012, the South African National Roads Agency Limited (SANRAL) introduced an IoT-based e-tolling system on the Gauteng highway. The e-toll system, called Open Road Tolling, is meant to collect tolls electronically without human intervention, since there are no physical booths on the highway. The system charges all vehicles using the highway without their slowing down or stopping. Simple overhead gantries are fitted with toll collection devices capable of recognizing an electronic tag attached to each vehicle as it passes through. Vehicle owners purchase the IoT-based electronic tags, fit them in their vehicles, and reload the tags whenever the credit reaches zero. With this technology, traffic jams have been reduced dramatically, and the tags can be easily purchased or reloaded at stores around the country.
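The prepaid-tag mechanics can be illustrated with a toy model. The class and figures below are hypothetical, not SANRAL's system; they only show the charge-on-pass, reload-at-zero cycle described above.

```python
class TollTag:
    """A prepaid e-toll tag account (illustrative sketch only)."""

    def __init__(self, tag_id, balance=0.0):
        self.tag_id = tag_id
        self.balance = balance

    def reload(self, amount):
        """Top up the tag's credit, e.g. at a store."""
        self.balance += amount

def pass_gantry(tag, toll):
    """Debit the tag as the vehicle passes a gantry; refuse if credit is out."""
    if tag.balance < toll:
        return False   # owner must reload before the next pass
    tag.balance -= toll
    return True
```

For example, a tag loaded with 50.0 covers one 30.0 toll, is refused on the second pass, and works again after a reload.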

3. Waste Management Systems in Kenya

Nairobi County in Kenya has been grappling with waste management issues for a long time. To tackle this problem, Nairobi officials approached IBM to develop an IoT-based application for waste management. Basically, the idea is to develop a solution that can be installed in the waste collection fleet to monitor it in real time. The application is also meant to create a digital map of the Nairobi streets.

IoT-Based Smart Sensors for Waste Management

The IoT-based solution called for the fleet of waste collection trucks to be fitted with smart sensors that report whether the vehicles are in the garage or on the road. The sensors can also check dumpsites to see if they are full and need to be emptied, how long a waste collection truck has spent in traffic, and the time it takes to collect garbage. The application is also expected to automatically monitor driver behavior, detect speed bumps and potholes, and check the driver's fuel usage. The initiative has enabled Nairobi County to track the garbage fleet and ensure that the trucks are doing their job at the allotted time. The smart sensors allowed Nairobi County to see great improvements during the trial period, as collected waste volumes increased tremendously.
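The dumpsite fill-level check is simple enough to sketch. This is a hedged example, with the threshold and data format assumed rather than taken from the IBM deployment:

```python
def bins_to_service(readings, threshold=0.8):
    """Return the IDs of dumpsites whose fill level meets the threshold.

    `readings` maps a site ID to a fill level between 0.0 (empty)
    and 1.0 (full), as reported by the smart sensors.
    """
    return sorted(site for site, level in readings.items() if level >= threshold)
```

A dispatcher could run this over the latest sensor snapshot to build the day's collection route, e.g. `bins_to_service({"A": 0.95, "B": 0.40, "C": 0.81})` returns `["A", "C"]`.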

4. Product Verification Initiative in Nigeria

Faced with a perennial drug counterfeiting problem, Nigeria's National Agency for Food and Drug Administration and Control (NAFDAC) in 2010 launched an IoT-based product verification initiative to curb drug counterfeiting using radio frequency identification (RFID). The initiative was carried out in collaboration with Verification Technology Limited (VTL). The solution used RFID-equipped tags to secure the integrity of the drugs throughout the supply chain, from manufacturers to distributors, wholesalers, retailers, and consumers.

RFID Tags to Prevent Counterfeit Drugs

The RFID tags are expected to track the drug's path as it moves across the supply chain. To verify a drug's authenticity, special RFID scanners will be placed at the ports of entry. It is also expected that RFID scanners will be purchased by hospitals, pharmacists, and manufacturers, creating a collective effort against drug counterfeiting in Nigeria.
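At its core, verification at a port of entry is a registry lookup, sketched below. The tag IDs and registry structure are invented for the example; the details of the real NAFDAC/VTL system are not described here.

```python
# Registry of RFID tag IDs issued to licensed manufacturers (hypothetical values)
GENUINE_TAGS = {"RX-0001", "RX-0002", "RX-0003"}

def verify_drug(tag_id, registry=GENUINE_TAGS):
    """Return True if the scanned tag was issued through the official chain."""
    return tag_id in registry
```

So `verify_drug("RX-0002")` passes, while a package whose tag is absent from the registry fails the check.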

5. Remote Appliance Control in Egypt

Egypt has shown that IoT solutions can be used to solve societal problems through innovation. A Cairo-based technology firm called Integreight announced that it has developed an IoT chip that can be integrated with modern appliances like refrigerators, cameras, TVs, and washing machines. This IoT-based product, named 1Sheeld, gives users the capability to control their appliances remotely by simply connecting the chip to their smartphones.

Remote Control through 1Sheeld

The 1Sheeld technology uses an Arduino board, and the 1Sheeld application can then be accessed from a smartphone over Bluetooth. Using the 1Sheeld library, code can be written in the Arduino software application before uploading it to the board. This allows the control of many different sensors available on the board. There are other IoT-based proposals underway in Egypt, including using sensors for precision potato farming and beekeeping.

6. Electronic Dispensing Tools (EDT) in Namibia

This small nation in southwestern Africa is not to be left behind in the field of IoT. To improve the effectiveness of antiretroviral drugs, Namibia implemented an IoT-based electronic dispensing tool. Pharmacists must dispense the correct medicine in the correct amounts to patients; if a patient misses medication or is given too much, it becomes a very big health problem. Pharmacists require at least some minimal information about a patient's medical history, which is essential if the patient is to receive optimized care and if pharmaceutical providers are to effectively manage their medicine inventory.

EDT for Accurate Dispensing of Medicine

Electronic Dispensing Tools help pharmaceutical providers collect, manage, and generate the records needed for accurate dispensing of medicine. The data collected includes patient profiles and the medicine inventory. The IoT-based devices can also manage the inventory and logistics of the medicine, alert patients of upcoming appointments by SMS, allow multiple users to work on the same database at the same time, and provide customized medical reporting functions.

To learn more about the IoT progress report for Africa, you can read the very comprehensive paper by Nashon Onyalo, Hosea Kandie, and Josiah Njuki, published in the International Journal of Computer Science and Software Engineering (IJCSSE).

Almost all countries in Africa have developing economies, and they stand to benefit the most from adopting applications built on IoT platforms. IoT will change people's lives and improve processes, services, and ways of life. It is a cutting-edge technology well suited to developing markets, bringing flexible connectivity to devices across the entire African region.

Simplicity in Procurement

A reliable procurement system is designed for speed, efficiency and accuracy, and one of the best ways to achieve the intended design is to keep things simple. However, despite all precautions, problems ranging from human error to organizational shortcomings can still have a negative effect on procurement and purchasing ability, and the greatest issue plaguing procurement today is complexity.

The complexity of the procurement process is one of the main reasons procurement has a diminished reputation today compared to previous years. Teams in the procurement community aim to help their business counterparts ease the pressure on their budgets while making sure that standards like consistency and quality are upheld by their suppliers.

The big problem is that no matter how hard procurement tries to ease the burden on companies, complexity is still a huge factor in the business. If procurement asks the company to use specific software, or to fill out a form to make a purchase, that is an additional step that was not there before. In the old days, the mantra was: "Buy what is needed, when it is needed". In today's business, however, complexity in the procurement process is mounting, and the industry must work together to bring back simplicity as one of the core precepts of procurement.

The reasons to bring back simplicity in procurement:

1) Simplicity means speed

If the procurement team can prove to stakeholders that working with them doesn't slow things down, that removes a huge amount of anxiety. For stakeholders, the purchasing process shouldn't be complex. Procurement may require intelligent solutions to manage suppliers and handle payments, but it should be kept as simple as possible. Simplicity in the procurement process frees up resources for value-adding activities rather than consuming them.

2) Simplicity enables big ideas

To function better, procurement departments must understand their company's concerns and be able to demonstrate how they measure up against them. If the company's requirement spans different disciplines, making it very complicated, the procurement department's job is to convert that complicated requirement into a simple one. Procurement must come up with simple ways and big ideas for measuring a requirement and then turn that into a strategy that takes the complexity away.

3) Simplicity means lean

No company needs to have thousands of suppliers. Procurement should learn to do more with less. The key to the future of procurement is deploying the most effective capabilities where they can make the most impact and automating everything else to keep operations lean and simple, without the unwanted complexity.

4) Simplicity is inevitable

The longer you spend creating a complex system, the more you consequently spend updating, rolling out, and enforcing that system. Complexity breeds complexity, and it's only a matter of time until it lessens efficiency and drives up costs immensely. Procurement is at its best when it helps stakeholders make a quick decision rather than imposing rules and forms that bog the process down. To achieve simplicity, procurement must step beyond the established automated purchasing setup and use data and expertise to enable strategic decision-making. This can be done with the help of a sophisticated set of inputs and informed analysis that transforms business complexity into a simple process.

5 Key Factors in Designing a Wireless Network for your Business

Wireless networks have become an integral part of any business environment these days, especially due to the increased prominence of all kinds of wireless devices such as laptops, tablets, and smartphones. Wireless networks are now ubiquitous because they provide the expected convenience, portability, and flexibility demanded by any serious business today. Another reason you see such an increase in demand for setup of wireless networks in business environments is due to the fact that a wireless network is now much faster and more reliable than before.

In this article, I will briefly explain some key factors that any business person who wants to set up or commission a wireless network should be familiar with. This greatly helps you communicate professionally with your team and with the company you contract to set up your wireless network.

What do we mean by “Wireless”?

The term wireless is of course a very broad and general term and can be applied to any device or technology that works without wires! Wireless communications include GSM, WIMAX, satellite, radio, microwave, Bluetooth, and many other means of communications.

However, when we talk about a wireless network for computers, we are loosely using the term to refer to "WiFi", or the IEEE 802.11 standard. This is the common wireless (WiFi) connection you have on your laptops, tablets, and smartphones.

5 Key factors of a Wireless Network

Below are 5 key factors you need to get right in any wireless network set up for your business:

1) Indoor / Outdoor Access Points

Regardless of the brand, wireless access points are divided into indoor and outdoor devices. Indoor access points are not weatherproof, but they cost less and are installed on the walls or ceilings of offices.

Outdoor access points on the other hand are designed to withstand outdoor climates and are usually used to cover the outside premises of your office or compound.

There are, however, cases where outdoor access points are used to provide indoor coverage. When you have dispersed, separate prefab caravans, trailers, or tents (as on oil rigs or in man camps), providing coverage inside these rooms can be more efficient with outdoor access points.

2)  Number of Access Points
Determining the correct number of required access points and their proper positioning to provide proper coverage is probably the most critical step that determines the success of a wireless network.

There are many factors for determining the required number of access points, but the two main ones are:

a. You need to make sure all locations are within range of a wireless access point.

b. You need to make sure you have enough access points in crowded areas (one per 20-30 users).
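These two constraints can be combined into a quick first estimate. This is a back-of-envelope sketch, not a substitute for a proper site survey: the 30 m range and 25 users per AP are assumed rule-of-thumb figures, and real designs must account for walls, interference, and channel planning.

```python
import math

def required_access_points(area_m2, users, ap_range_m=30, users_per_ap=25):
    """Estimate AP count as the larger of a coverage and a capacity figure.

    Coverage: circular cells of radius `ap_range_m` tiling the floor area.
    Capacity: `users_per_ap` clients per AP (middle of the 20-30 rule).
    """
    cell_area = math.pi * ap_range_m ** 2
    for_coverage = math.ceil(area_m2 / cell_area)
    for_capacity = math.ceil(users / users_per_ap)
    return max(for_coverage, for_capacity)
```

For a 2,000 m² open-plan office with 120 users, coverage alone suggests a single AP, but capacity pushes the estimate to 5.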

3) Access Point Bandwidth/Speed

The speed and bandwidth offered by wireless access points are increasing at an incredible rate year by year. While access point speeds ranged from 11 to a maximum of 54 Mbps back in the 2000s (the IEEE 802.11a/b/g standards), now, thanks to the advent of new technologies such as MIMO, the common standard is IEEE 802.11ac, which provides speeds as high as 780 Mbps. New standards are expected as early as 2017 to support speeds up to 100 Gbps! If you are planning to set up your wireless network this year, you should go for the 802.11ac standard or newer.

4) Access Point Frequency Spectrum/Band

Older access points tended to support the 2.4 GHz band. The newer 802.11ac standard also supports the 5 GHz band, enabling higher bandwidth and speed.

5) WLAN Controller

While a standalone access point might be sufficient for a very small office, in larger premises where you have to set up multiple access points, a WLAN controller becomes quite critical.

The WLAN controller enables management of multiple access points, not only simplifying their administration from a single control point but also enabling seamless roaming for wireless devices on the move. Some manufacturers now provide software services as an alternative to hardware WLAN controllers, which can be a very attractive and cost-effective option for SMBs.

Who can design and implement a successful wireless network?

Whether you want to test the wireless knowledge of your in-house team or the capabilities of a potential contractor, the above information will give an idea of some of the key concepts any wireless implementer should be well aware of.

The successful design of a wireless network involves much more than that, of course. Unless you simply need 1-2 access points to cover a small office, make sure you get a professional to do it for you.

Procurement Skills

It’s a common saying in the procurement industry: “What procurement heads want most from their staff is interpersonal skills”. But is this just an unhelpful cliché? To find out, Procurement Leaders recently launched the Leadership in Procurement report, a piece of research that projects what is expected of procurement leaders and the modern procurement function. In the report, strategic thinking, commercial awareness, and leadership skills came up as the biggest skills gaps that the procurement industry is facing today. These are not technical skills, but all soft interpersonal skills. In fact, according to many procurement professionals, the largest leadership skill gaps lie in being a visionary, the ability to build a team culture, communication, and openness to change.

It appears that the industry has been focusing too much on developing technical skills such as negotiation, spend analysis, and contract management, but has not invested enough in the people element. We often forget that a large portion of the procurement process lies in the ability to persuade, challenge, and influence for the good of the business. For that to be effective, procurement departments need to build the confidence to communicate and engage with other parts of the business.

Procurement personnel need to take input and go out to the marketplace to sell their company and excite suppliers. Procurement has to gather ideas from suppliers to sell back to its own marketing team, putting procurement personnel in an internal consultancy and salesperson role where they need to be dynamic and always up to date on the latest technologies.

With the criteria for procurement skills shifting, below are the skill sets that procurement personnel need to master, in order of priority:

1) Relationship Management

The ability to leverage interpersonal skills to establish rapport and develop relationships with all key stakeholders: suppliers, customers, and colleagues.

2) Negotiation Skills

The ability to persuade, influence, and explore positions and alternatives to reach outcomes that will gain the acceptance of all parties and meet the organization's strategic procurement objectives.

3) Professionalism

The ability to think carefully about the likely effects on others of words, actions, appearance, and mode of behavior. The consummate professional selects the words and actions most likely to have the desired effect on the group or individual in question.

4) Results Focused

The ability and drive for achieving and surpassing targets against internal or external standards of excellence. This is about showing a passion for improving the delivery of services with a commitment to continuous improvement in your procurement process.

5) Technology Aptitude

The ability to apply and improve in-depth specialized knowledge, skills, and judgment by assessing and translating information technology into responsive and effective procurement solutions.

6) Financial Acumen

The ability to apply a broad understanding of financial management principles and other quantitative information to ensure decisions are fiscally responsible and based on the procurement budget.

7) Strategic Industry Management

Establishing long-range business plans which can anticipate the global market. This is particularly important for commodity procurement.

8) Category Management

Categorizing the spending according to specific goods or services (direct and indirect) and keeping in mind quality, service, risk and cost.

9) Project Management

Driving the procurement process by designing, implementing, and managing projects to a successful conclusion. Establishing accountability, timelines, and goals is paramount.

10) Analytical Skills

The ability to articulate and solve both complex and uncomplicated problems and concepts and make decisions that make sense based on all available information. Particularly important in the selection of vendors.

What is Virtual CTO – Who needs one?

A Chief Technology Officer (CTO), as the name suggests, is a key executive inside a company or organization, responsible for understanding the business drivers and aligning the technologies required to meet the business objectives. Considering the speed of technological advances, it is extremely important (if not vital) to constantly adapt the business to these advances in order to keep up with the competition, which makes the role of CTO critical for any modern business.

However, despite the importance of the CTO position, many small and medium businesses (SMBs) simply can't afford the overhead of a full-time CTO, or are simply not of a size that would require one.

So is there a solution? Of course: a Virtual CTO!

What is Virtual CTO?

As with so many other examples (such as cloud computing), the advance of communications has created many business models where you pay for a service only as much as you need it. A Virtual CTO is exactly that.

Just as with a cloud server, where you don't need to pay the costs of a dedicated server, a Virtual CTO is a consulting service that fulfills the role of a CTO on demand, for a business of any size, at very competitive prices.

This is an ideal service for any SMB, enabling these companies to keep up with technology and make the best of it.

What services should you expect from a Virtual CTO Service?

Like almost anything else, Virtual CTO services come in a range of good and bad offerings at different costs. The services you should expect from a good Virtual CTO include:

  • Carefully analyzing your business models, available resources, and infrastructure, and giving you a clear analysis of your system's efficiency.
  • Providing consultation and suggestions on how your organization can utilize new technologies to increase output and improve efficiency.
  • Enabling you to make informed technology decisions and properly manage technology within your business.
  • Providing unbiased technical consultation for your business problems.
  • Ensuring you receive the highest ROI (return on investment) on all your technology investments.
  • Helping you save costs by utilizing the right technology.
  • Managing your vendor relationships and negotiations in the best interest of your company in new purchases.
  • Ensuring correct software solutions are designed and developed.
  • Managing your technology projects.

What are the benefits of a Virtual CTO for SMBs?

I would highly recommend a Virtual CTO service to SMBs because:

  • It can maximize the ROI (return on investment) of IT systems, saving great time and effort without the need to pay for a full-time position.
  • It can provide the combined knowledge and expertise of a full team of experts, rather than depending on one person's ideas and knowledge.
  • It enables you to focus on your core business while experts fulfill this important aspect of it.
  • There are many ways that utilizing the correct IT technologies can boost your performance, save costs, and keep you ahead of your competitors – you simply can't take the risk of missing the opportunity.

What is Cloud Computing? A short, simple explanation

The term “Cloud Computing” is heard and read everywhere these days, and many companies are constantly wondering what exactly it is and whether they should adopt it or not.

While cloud computing can be discussed from many angles (your Google search, for example, can technically be called cloud computing), in this article I'll give a very simple and short explanation of what it is all about as far as companies are concerned.

Very simply put, cloud computing means storing and accessing applications and data over the internet instead of the hard disks of local servers or computers.

The whole idea behind cloud computing is that instead of businesses having to run their own servers and data storage locally (which would mean purchasing hardware and software, upgrading and maintaining them, and handling their security), they rent the needed services from companies like Amazon or Apple, which provide these resources as a service over the internet on a pay-as-you-go basis.

How does Cloud Computing work?

Companies that provide cloud computing services host data centers with multiple interconnected servers and use special virtualization software to create a large pool of computing and storage resources. This pool is divided into virtual resources that are rented to users and clients as a service.

What are the benefits of Cloud Computing?

The main users of cloud computing services are SMBs (small and medium businesses), as it enables them to quickly set up the computing resources they need and pay only for what they use.

The main benefits of cloud computing can be summarized as below:

  • Quick and easy setup: instead of needing experts to configure your local servers and install the required applications, you can set up cloud services and resources by going through a few web pages that guide you step by step.
  • Elasticity: you won't need to worry about upgrading your hardware when your business grows – this can simply be done at whatever stage you see the need.
  • Pay for what you use: instead of investing in hardware and software, upgrades, and maintenance, you simply pay for the services as much as you need and use them.
  • High accessibility: most cloud computing services allow you to access your data and applications from anywhere on the internet, using any connected device such as a tablet or smartphone.

What Services are provided by Cloud Computing?

Cloud computing services are divided into 3 categories:

  • Infrastructure as a Service (IaaS): buying access to raw computing hardware, such as servers or storage, over the internet. You pay for these resources by usage instead of purchasing the hardware.
  • Software as a Service (SaaS): using applications hosted on the internet. You pay for the software by usage instead of purchasing it and hosting it yourself.
  • Platform as a Service (PaaS): developing applications using web-based tools so they run on systems software and hardware provided by others.

Drawbacks

Cloud computing comes with many benefits for SMBs. However, it also has its drawbacks, which I've listed below:

  • It fully depends on highly reliable internet: although this is not an issue for most locations these days, it can be a challenge when you are working in remote or undeveloped locations without reliable, high-speed internet access.
  • Higher operating costs: cloud services are charged either per user/time or per amount of resource used. While this is a benefit for many SMBs, it can make a cloud solution costly for larger companies compared to setting up your own servers and services.
  • Greater dependency on service providers: in cloud computing, you are to a large extent dependent on the services provided by the hosting company. If they decide to drop a service, it can cause a big issue for your business.
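The operating-cost drawback lends itself to back-of-envelope arithmetic. All the figures below are hypothetical, and a real comparison would also price in staff, power, bandwidth, and hardware replacement cycles:

```python
def cheaper_option(months, cloud_per_month, server_capex, server_opex_per_month):
    """Compare cumulative cloud rental against buying and running a server."""
    cloud_total = cloud_per_month * months
    onprem_total = server_capex + server_opex_per_month * months
    return "cloud" if cloud_total <= onprem_total else "on-premises"
```

With an illustrative $200/month cloud bill against a $6,000 server costing $50/month to run, the cloud wins over the first year (`cheaper_option(12, 200, 6000, 50)` gives `"cloud"`) but loses over five years (`cheaper_option(60, 200, 6000, 50)` gives `"on-premises"`), which is exactly the scale effect described above.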

Conclusion

Cloud computing is rapidly growing and will continue to grow. The issue is not whether you will use cloud computing or not – you are already using cloud computing in many ways if you are using the internet.

But as far as cloud computing vs. setting up your own servers is concerned, the answer depends largely on the size of your business: if you are an SMB and your business is not "IT", then a cloud solution is most probably the correct choice. For large businesses, you might want to calculate the overall costs carefully before you decide.

How to Properly Maintain Your Digital Projector

Digital projectors are delicate and expensive, and they should be properly maintained and taken care of. Taking proper care of the projector can greatly prolong its lifespan and can make sure that it will always perform at its optimal level. Performing regular maintenance on all the various parts of the projector will ensure that the projector will always display the best quality image possible to the audience. Certain precautions should also be taken when operating the projector on a daily basis in order to improve its lifespan.

Listed below are the steps to take to extend your digital projector’s life:

Correct Locations for the Projector

Thoroughly read the manufacturer's manual to learn the proper places to install the unit. The rule of thumb is to leave at least two feet of free space around the projector for ventilation and heat dissipation. Never place the projector in direct sunlight or next to a heat source. If the projector is mounted on the ceiling, make sure it is not directly next to a heat vent or an air conditioning unit, and do not use the projector in smoke-filled rooms, as this greatly increases the chance of damaging the projector optics.

Digital projectors produce large amounts of heat during operation, and this heat must be channeled away from the projector to prevent overheating and malfunction. Removing this heat effectively is especially important for the life of the projector lamp, since overheating is the primary cause of lamp failure.

Proper Storing and Transporting

When unpacking a new projector, make sure the box and packaging it came in are stored away safely. If it becomes necessary to transport the projector to another location, storing it in anything other than the original box and packaging carries a higher risk of damage in transit. Original packaging has protective styrofoam molded to the exact shape of the projector, which prevents it from moving while in transit. When transporting the projector, never leave it in environments with extreme temperatures.

Regular Projector Maintenance: Protecting Your Investment

Digital projectors have filters, which is where the dust particles removed from the air end up. To keep the filters performing well, they must be maintained regularly. This is very important, since damage caused by dirty filters may not be covered by the projector warranty. Generally, digital projector filters should be cleaned once every 100 to 300 hours; check the manual to find the recommended filter cleaning cycle for your specific projector.
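The 100-300 hour rule translates into a trivial reminder calculation. This is a sketch only: the 200-hour default is just the midpoint of that range, and your projector's manual overrides it.

```python
def hours_until_cleaning(hours_since_cleaned, cycle_hours=200):
    """Hours of use left before the next filter cleaning is due (0 = overdue)."""
    return max(cycle_hours - hours_since_cleaned, 0)
```

A projector used 4 hours a day and last cleaned 150 hours ago would be due in `hours_until_cleaning(150)` = 50 hours, i.e. within about two weeks.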

Hiring 3rd Party Professionals for Projector Maintenance

More than 80% of projectors and 90% of projector lamps that fail could have had their life spans extended through regular cleaning. Periodic cleaning is required to extend the projector’s life and maintain optimum performance. To ensure this maintenance is carried out properly, many companies hire third-party professionals to maintain their digital projectors. These professionals provide extensive design, installation, monitoring, and maintenance services to deliver and support high-impact digital display systems, drawing on years of experience across multiple digital projector systems.

Top 5 Ideas for IOT That Could Change your Life

As I explained in my previous post “What is IOT? A short, simple explanation”, the “Internet of Things” or IOT-related technologies are booming at an unpredictable pace. There are literally hundreds of thousands of new ideas on how we can benefit from the IOT concept and the list is expanding on an hourly basis.

In this post, I’ve tried to share the top 5 IOT ideas that can considerably change the way we live in the next few years, noting that the technology for all the ideas is already available:

1) Integration of Virtual Reality Glasses and IOT

Virtual Reality is probably one of the hottest technology topics these days, expanding at an extraordinary pace. Now imagine that you are wearing new, nice-looking VR glasses that include a small speaker and microphone and are also connected to the internet through IOT – all kinds of possibilities open up:

  • You are automatically informed of emergency messages and can quickly read the ones you choose, or have them read to you.
  • When walking down the street, the people you know are highlighted in the crowd (detected through their mobiles or IOT devices).
  • You can ask the VR headset for the closest drugstore; the directions are shown in front of your eyes, and once the store is in sight, it is specifically highlighted.
  • You can share your shopping list, and as you pass each store, whatever is on your list and available in that store is shown with its price.
  • And an endless list of other features and possibilities…

2) Medical Wearables

Imagine that you and your family have smart watches that monitor each person’s location, movement, and health status, all connected to the Internet:

  • In case of an abnormal health condition, the watch automatically alerts you and offers initial medical recommendations.
  • If required, the relevant people are automatically informed of a life-threatening emergency, or an ambulance is called to the location while the patient’s health status is continuously shared.
  • If a child goes outside the defined geo-boundaries, the parents are automatically alerted.
  • If a grandparent is sensed to have not moved for an unreasonably long period, or is in an unexpected location, an alert specifying the exact location is sent to the relevant people.
  • If you are under medication monitoring and your heart rate goes higher than anticipated, your doctor is automatically informed and your health condition is shared.
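
Rules like the ones above reduce to threshold and geofence checks on streamed sensor readings. Below is a minimal sketch in Python; the class, thresholds, and simplified coordinate scheme are all illustrative assumptions, not a real wearable API:

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Reading:
    wearer: str
    heart_rate: int   # beats per minute
    x_km: float       # simplified planar position relative to home
    y_km: float

# Illustrative rule parameters; a real system would use per-person settings
MAX_HEART_RATE = 120
GEOFENCE_CENTER = (0.0, 0.0)   # e.g. the family home
GEOFENCE_RADIUS_KM = 2.0

def check_alerts(reading: Reading) -> list[str]:
    """Return alert messages for a single sensor reading."""
    alerts = []
    if reading.heart_rate > MAX_HEART_RATE:
        alerts.append(f"{reading.wearer}: heart rate {reading.heart_rate} bpm exceeds limit")
    dist = hypot(reading.x_km - GEOFENCE_CENTER[0], reading.y_km - GEOFENCE_CENTER[1])
    if dist > GEOFENCE_RADIUS_KM:
        alerts.append(f"{reading.wearer}: outside geo-boundary ({dist:.1f} km from home)")
    return alerts
```

In a real deployment these checks would run server-side against readings pushed by the watch, with alerts routed to family members, the doctor, or emergency services.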

3) Intelligent Home Appliances

Intelligent, internet-connected home appliances are definitely one of the most interesting uses of IoT in the very near future. Some examples are:

  • Your fridge checks its contents (identified through RFID tags attached to each food package) against a list you define once a day, and automatically orders anything missing from your selected shop.
  • Your coffee maker automatically prepares coffee when it senses you approaching home through your smart watch’s location.
  • Your music station automatically continues playing the music you were listening to in your car, from the point where you left the car to enter your home.
  • Your TV set automatically senses when you are present and watching, and informs you of any emergency messages you might have received by phone or email.

4) Smart Vehicles

Smart vehicles are already in the final stages of development at most well-known car manufacturers. Some features you could commonly find in the next few years:

  • While you drive, you would be informed of the best route to your destination, nearby car accidents, heavy traffic, road conditions, and danger zones.
  • Your vehicle would act as your assistant, following instructions such as answering an email, reserving a restaurant table or flight ticket, connecting a phone conversation, or automatically answering an incoming call.
  • After you get out, the vehicle would automatically park itself in a free spot in the parking lot, sending the parked location to your mobile.
  • It would automatically alert you to any possible accident, or even take control to prevent one.
  • It would brief you on the closest hotels or restaurants and lead you to the one you select.
  • It would sense if you are falling asleep or not conscious enough to drive, and would recommend a rest, or enforce one if necessary.

5) Automated Homes

IOT will certainly also change our homes in the next few years. Almost all parts of the home, from lights, doors, windows, and air conditioning systems to even trash bins, are expected to become part of IOT, and homes are also expected to be equipped with intelligent CCTV camera systems.

This would enable many facilities that might not be currently easily imaginable – just as a few examples:

  • The home’s lighting and temperature are automatically adjusted based on tracking the locations of people and each individual’s habits.
  • Doors are automatically opened for approved personnel with no need for a key while preventing access to unauthorized people.
  • Kids’ movements are intelligently tracked and an alert is sent to parents if needed.
  • An emergency number is called in case of any accidents or emergencies happening in the home.
  • Your mirror automatically reminds you of the agenda for the day after sensing who is standing in front of it every morning.
  • Trash collection is automatically requested once the trash bin is full.
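
All of the automations above can be expressed as simple event-to-action rules. The toy rule table below sketches the idea in Python; every device and action name is an invented placeholder, not a real home-automation API:

```python
# Map a (device, event) pair to the action the home should take.
# All device, event, and action names are illustrative assumptions.
RULES = {
    ("trash_bin", "full"): "request trash collection",
    ("front_door", "approved_person_detected"): "unlock door",
    ("smoke_detector", "smoke"): "call emergency number",
    ("mirror", "person_recognized"): "show today's agenda",
}

def handle_event(device: str, event: str) -> str:
    """Look up the action for a sensor event; unknown events are just logged."""
    return RULES.get((device, event), f"log: unhandled event {event} from {device}")
```

A production home hub would of course add authentication, schedules, and per-resident preferences on top of a dispatch table like this.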

Conclusion

Above are only a few ideas that are mostly already available or in the final stages of development. As mentioned before, we are entering a new arena of internet technology based on the IOT concept, which may well prove to be the next turning point in the technological revolution that started with the invention of the internet and had major breakthroughs with mobile devices and cloud computing. Are you ready for it?

What is IOT? – A short and simple explanation

The Internet of Things or “IoT” is probably one of the hottest technical topics of 2016. Although the concept is not new and goes back to the 1980s and 1990s, it is only now really gaining momentum, and it could become one of the fastest-growing businesses of the next 5 years (predicted to be worth up to trillions of USD).

There are huge numbers of posts, articles, and white papers on the internet trying to explain what IOT is, what possibilities it would create, and what its challenges are.

In this post, I’ve tried to explain the topics as simply as possible for those who want to get an idea of the concept in a very short time and without all the fuss.

What is IOT?

IOT is about connecting “Things” to the “Internet”. This connection can be wired or wireless.

The term “Things” is wisely chosen, as it can include literally everything: from obvious things such as computers and smartphones that are already connected to the Internet, to home appliances, wearables, vehicles, factory machines, tagged animals, and consumables.

What are the key components of IOT?

The key components of IoT can be summarized as follows:

  • Sensors: sensors enable us to collect data about the status of the “Thing”. They are probably the most important components of IOT, capturing data such as temperature, GPS location, speed, and any other usable data about the “Thing”.
  • Controllers: IoT is not just about collecting status data. It can also include controlling the “Thing” over the internet – turning a device on or off, stopping a vehicle, locking or unlocking a door, adjusting the temperature of an oven, or any other controllable aspect of the “Thing”.
  • Software: apart from the hardware that must be embedded in every “Thing” connected to the internet, probably the most exciting part of IoT is the software. Once you know which devices can be sensed and controlled over the internet, you have the tools for endless ideas and creativity through applications that provide automated or semi-automated solutions based on human-device and device-device communications.
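
How the three components fit together can be sketched with a thermostat as the “Thing”. This is plain Python with no real IoT protocol; the class and method names are illustrative assumptions:

```python
class Thermostat:
    """A 'Thing' combining a sensor (read) and a controller (set_heater)."""
    def __init__(self, temperature: float = 20.0):
        self.temperature = temperature
        self.heater_on = False

    # Sensor: report the Thing's status
    def read(self) -> dict:
        return {"temperature": self.temperature, "heater_on": self.heater_on}

    # Controller: change the Thing's state remotely
    def set_heater(self, on: bool) -> None:
        self.heater_on = on

# Software: the logic tying sensing and control together
def regulate(thing: Thermostat, target: float) -> None:
    status = thing.read()
    thing.set_heater(status["temperature"] < target)
```

In a real IoT system the `read` and `set_heater` calls would travel over the network (for example via a publish/subscribe protocol) rather than in-process.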

What “Things” are expected to be part of IoT?

It is very hard to predict what IoT will cover in the next 5 years – the concept is exploding with new ideas every day. Some ideas include:

  • Home appliances: fridges, cookers, coffee makers, heaters, HVAC, TVs, DVD players, lights, doors, windows …
  • Wearables: clothes, shoes, hats, watches, heart monitors …
  • Vehicles: cars, buses, bicycles, trains …
  • Factories: machines, robots, warehouse shelves, parts within machines, tools …
  • Agriculture: biochip transponders on farm animals and plants, farm humidity and temperature sensors …
  • Food: sensors for monitoring the condition of food.

Challenges and Risks

Without a doubt, the most critical challenge and risk of IoT is its security: how immune it will be against cyber-attacks, hackers, and unauthorized intruders. With billions of “Things” connected to the internet, unauthorized access could create the kinds of disasters we have already watched in so many science fiction movies.

6 Reasons for Replacing your Excel Files with CRM

Many companies use forms and lists created in MS Word or MS Excel (or their equivalents) to keep key information such as lists of clients, sales opportunities, and prospects, and then keep these files updated and shared by email. On the other hand, there is a lot of talk about utilizing Customer Relationship Management (CRM) solutions, which to many sounds like a fancy tool used by computer gurus and not something suitable for a non-IT business.

The key question, especially for small businesses, is: why should they replace the Excel forms they are used to and have been using for years with a CRM solution that seems so complicated?

In this post, I will very simply answer this key question.

What is CRM?

Simply put, a CRM is an online application that allows multiple users to access, share, update, and process the data related to handling customers and clients. Many CRMs do much more than this by providing communication tools and automatic workflows defined and fine-tuned for each specific business.

6 Reasons for Replacing your Excel Files with CRM

Below are 6 simple reasons why you should really start replacing your Excel files with a CRM:

1- Sharing a single source of updated information

Although you can also share an Excel file with a team, there are obvious challenges in maintaining the updated file – questions like “who has the latest version of the file?” and “how do we merge the information when two team members make changes on their own copies?”. Using a CRM puts an end to these and many other issues by providing a single source of information that is always up to date.

2- Keeping track of all customer interactions:

It would be extremely complicated and inconvenient to keep the full history of interactions with every single customer in an Excel file: you would either need to create a sheet per client or simply add details in a comment cell next to each client. A CRM, however, lets you keep a complete record of all customer interactions, such as meetings, emails, quotes, and phone calls, and access those records easily.

3- Integration with Email

One of the main methods of communicating with clients is email. In non-CRM setups, much of the most critical information lives not in the Excel files but in the inboxes of the individuals interacting with each customer. Other team members usually have no insight into this very valuable history, and even worse, when an employee leaves, it is usually lost or very hard to retrieve. In a properly set up CRM, email communication with each client goes through the CRM and is kept as part of that customer’s record, so anyone else in the company can easily access it.

4- Reporting features

Although Microsoft Excel also has many powerful reporting and data-summarizing features, a CRM is extremely powerful in this regard. Multiple tabular, summarized, or chart reports can be created automatically and even shared with company management through automatic emails. Many CRMs also provide data analysis reports, which are an incredibly useful tool for a company’s success.

5- Multiple levels of accessing the data

It is very common to want to provide different levels of information about each client to each level in your organization. In non-CRM setups, that would mean maintaining multiple copies of the same information in different Excel files, which is extremely inefficient and time consuming. Such access control can be configured easily in a CRM.

6- Automation

By automating workflows and the most common tasks in a CRM, you can save immense amounts of time and money. This is probably the most important benefit of adopting a CRM solution, and one you will appreciate more once you start using it.

Challenges of Setting up CRMs

Setting up a CRM can be quite challenging if you try to do it on your own. It requires a combination of technical and business expertise, and companies that try to do it themselves usually fail because they depend on someone who is simply not qualified.

To have a successful CRM solution, it is highly recommended that you do this through a consulting company who can:

  • Analyze your business needs
  • Propose the best CRM solution among hundreds of different available solutions
  • Tailor the CRM to your specific business needs
  • Transfer the old data to the CRM
  • Most importantly, mentor your staff to effectively use CRM in their daily work

5 Reasons for Outsourcing your IT Services – Does it also save costs?

Many companies debate whether outsourcing their IT services would save them any money or actually increase their costs versus having an in-house IT engineer handle their IT requirements.

Actually, I’ve seen many company owners who decide to go for a single in-house IT engineer because of the belief it would cut costs, or because they feel more confident and in control.

So you might be interested to know that the facts are actually the opposite: one of the main reasons (probably the main reason) for IT outsourcing is cost savings. This is so important that not just small companies but many large enterprises have adopted the IT outsourcing model as a cost-saving measure.

But apart from cost savings, there are other important reasons why I would recommend IT outsourcing, especially for small companies and startups.

5 Reasons for Outsourcing

Among many benefits one can list, I think the 5 below are the main reasons I would recommend outsourcing IT services:

  1. Saving Money: Saving money is probably the most important reason for IT outsourcing. Outsourcing eliminates the fixed costs of hiring employees, training, employment insurance and taxes, and much more.
  2. Receiving Professional Services: Nowadays it is almost impossible to find a single employee who is a master of all the IT fields a company needs, such as LAN infrastructure, setup and management of servers, VoIP, video conferencing, user support, etc. In my experience, companies that hire their own in-house IT staff fall into a trap where their IT engineer gives them the wrong technical solution, simply because he or she is not a subject-matter expert in every field the company expects. Outsourcing IT services ensures the company has access to a pool of resources, available when and as needed, who are experts in their own fields.
  3. Keeping Up to Date with New Technologies: It is extremely hard to expect a single IT employee to keep up with every new technological advance. This prevents companies that depend on internal IT resources from benefiting from new technologies that could effectively save costs. That is not the case with IT outsourcing providers: they always keep themselves up to date with new technologies to stay competitive.
  4. Improving Company Focus: In companies that depend on in-house IT resources, management inevitably finds itself dragged into decisions on IT expenditures, figuring out which solution is the right one to implement, or judging how accurate their IT employee’s estimates and technical evaluations are. I have even seen top management engaged in solving basic IT problems that ultimately halted operations. By outsourcing IT requirements, companies can focus on their core business.
  5. Flexibility: With outsourcing, it is very easy to call on resources when needed, without worrying about the workload of in-house IT staff.

Risks of Outsourcing

Like any other “good” service, IT outsourcing has its own risks. The most important one to guard against is the outsourcing company “locking” you in. I strongly recommend including in your agreement that you receive a copy of all network drawings, equipment login credentials, and configuration details. I would also recommend avoiding any customized or proprietary software that would tie you to a specific service provider. By following these precautions, you can always switch to another IT outsourcing company when needed.

Professional Digital Solutions for Cinemas and Theaters

Digital Cinema Projection (also called Digital Cinema) is a method of playing movies from digital copies stored on electronic devices such as high-capacity hard drives and servers instead of traditional reel-to-reel film. While the traditional method of playing movies in cinemas involves projecting light through a film, digital cinema utilizes technologies such as Digital Light Processing and Liquid Crystal on Silicon, and the movie is projected onto the screen using a digital projector instead of a conventional film projector.

Advantages of Digital Cinema

Distribution

Unlike the traditional method of distributing a movie film reel to each cinema, digital cinema movies can be distributed using hard drives, the internet, dedicated satellite links, or optical disks such as Blu-ray. Movies are supplied to the theatre as a digital file called a Digital Cinema Package, which for typical films will be between 90GB and 300GB in size. Distribution of digital cinema is simple, fast, and inexpensive compared to the time and cost incurred for the shipping and handling of heavy film tapes.
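
A quick back-of-the-envelope calculation shows why electronic distribution is practical even for files of this size. The link speed below is an assumption chosen only for illustration:

```python
def transfer_hours(size_gb: float, link_mbps: float) -> float:
    """Hours to move a file of size_gb gigabytes over a link_mbps link.
    Uses 1 GB = 8000 megabits and assumes the link runs at full rate."""
    return size_gb * 8000 / link_mbps / 3600

# A 200 GB Digital Cinema Package over an assumed 100 Mbps connection:
print(round(transfer_hours(200, 100), 1))  # roughly 4.4 hours
```

Even at modest link speeds, a package arrives in hours rather than the days that shipping physical reels can take.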

Quality

When comparing the average film presentation to the average digital cinema presentation, digital presentations are usually equal in picture quality and have more stable images than film. Digital cameras are often highly configurable, use detachable modular components for flexibility and upgradeability, and can record high-resolution images up to 4096 x 2304 pixels. And with the popularity of 3D movies today, digital cinema has an advantage, since digital 3D uses polarization instead of the colored glasses used by older 3D cinemas, ensuring that the color of the finished image is not corrupted. Digital cameras are also better for indoor shooting and shooting at night in very low light.

Durability

Digital presentations don’t get scratched like film, and they don’t fade or suffer the problems that film experiences, especially after being played for an extended time. The picture and sound quality of digital presentations last indefinitely, unlike for film which eventually degrades.

Ease of Operation

Theaters with digital cinema technology do not have to worry about assembling physical film prints and moving them from screen to screen to play a movie. With digital cinema, shows can be queued along with trailers and advertisements, without the reel changes traditional film requires. Digital cinema can be shown and managed with minimal training, as the management terminal is PC-based and simple to handle, unlike analog film, which needs dedicated personnel for receiving, prepping, showing, dismantling, and returning film prints. Because of this ease of operation and superior image quality, digital cinema presentations dramatically improve the audience’s movie experience.

Digital cinema technology is still in its infancy compared to traditional film, and like all new technologies it is initially expensive. However, the advantages of digital cinema over film are encouraging theaters to invest in digital projection equipment in order to let their audience have an amazing movie-going experience.

Digital Cinema Projectors

Movie projectors designed for theaters such as the NEC NC1100L DLP cinema projector deliver an enhanced theater experience with pristine images. These projectors are compact and lightweight laser-based 2K DCI-certified digital cinema projectors. Their small sizes enable them to be installed in small projection booths within the theater, or transported for mobile applications. Their built-in all-in-one Integrated Media Server with 2 Terabytes of storage offers versatile connectivity, while reducing the number of peripheral devices that are needed for operation. The NC1100L features 2K (2048 x 1080) resolution, 3D capabilities, and 3-chip DMD reflection method, and is easy to operate with user-friendly accessibility and up to 20,000 hours of laser life.

The features and capabilities of the NEC NC1100L DLP cinema projector include the following:

  • Laser light source ensures no black screen
  • Approximately 20,000 hours of expected usage of the laser and Digital Micromirror Device
  • 3D content utilizes the projector’s full 2K resolution and triple flash technology for smooth motion
  • Complete line of (6) bayonet-style lenses with electronic zoom, focus and lens shift and lens memory (4 lenses)
  • Direct selection buttons for eight stored projector configurations
  • Projector can be controlled from an optional touchscreen

Vizocom is providing Digital Cinema Projection Systems based on this technology to the United States Air Force for their Morale, Welfare, and Recreation programs.

How to Select the Correct CCTV Camera to Use?

There are literally thousands of camera models from hundreds of manufacturers on the market, and this makes selecting the most appropriate CCTV camera a very confusing task.

In this article, I will try to explain very simply the different parameters you need to look into when selecting a CCTV camera for your project, as well as the required knowledge to determine if what you are being offered really matches what you expect or not.

Please note that I will be discussing IP Cameras here and not analog. As explained in my previous article “Analog CCTV vs IP Cameras – What’s the Correct Choice?” – analog CCTV is not recommended anymore.

Key parameters for selecting the proper type of CCTV camera:

1) Outdoor/Indoor: One of the parameters that can be easily filtered by available choices is whether the camera is for indoor or outdoor use. While it is essential to use an “outdoor” type camera for outdoor installations to provide IP65/IP66 weather protection, “indoor” cameras are less bulky and more cost effective for indoor installations.


2) Coverage Area / Target Distance: The second most important parameter in selecting the correct type of camera is to determine the coverage area and target distance for each camera in your system. Below are some of the key camera parameters that will be determined based on the coverage area / target distance:

  • Fixed vs PTZ: PTZ (Pan-Tilt-Zoom) cameras, as the name suggests, let the user turn the camera view in any needed direction and zoom in on specific areas. They can also be pre-programmed to automatically scan specific routes. Fixed cameras, on the other hand, provide constant, uninterrupted monitoring of a specific fixed area such as entrances, exits, or perimeters.

  • Type of Lens: the type of lens determines the angle width and the distance each camera can cover. As a rule of thumb, the wider the coverage angle, the shorter the distance covered. A vari-focal lens lets you adjust these two parameters in practice. There are also 180-degree and 360-degree fisheye cameras, used indoors to provide wider coverage.

3) Image Resolution: in IP cameras, this is determined by the number of pixels (color dots) each camera image consists of. The higher the resolution, the more image detail is captured and provided by the camera; at the same time, more storage capacity is required for recording. Common resolutions currently start at 720p HD (1280 x 720 pixels) and go up to 5MP (5 megapixels, or 2592 x 1944 pixels).
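
The resolution/storage trade-off is easy to estimate. In the sketch below, the bitrates are illustrative assumptions; real figures depend on the codec, frame rate, and scene complexity:

```python
def storage_gb_per_day(bitrate_mbps: float) -> float:
    """Gigabytes of recording per camera per 24 hours at a given bitrate.
    Uses 1 GB = 8000 megabits (86400 seconds in a day)."""
    return bitrate_mbps * 86400 / 8000

# Assumed typical H.264 bitrates: ~2 Mbps for 720p, ~8 Mbps for 5MP
print(round(storage_gb_per_day(2), 1))  # 720p: 21.6 GB/day
print(round(storage_gb_per_day(8), 1))  # 5MP:  86.4 GB/day
```

Multiplying by the camera count and retention period gives a first-cut size for the recorder.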

4) Night Vision / IR: If you need to capture video in darkness, look for cameras with day/night and IR (infrared) lighting configurations. IR cameras use infrared LEDs that switch on automatically in the dark, enabling the camera to capture black & white video in complete darkness. A camera’s coverage in darkness is determined by the power of its IR light, which is a parameter you should check when selecting such cameras.

5) Camera Housing: camera housing is also an important factor to consider when selecting your camera. Below are the main types:

  • Dome Cameras: Dome cameras are used both indoors and outdoors, for both fixed and PTZ models. They have a nice look, and it is also hard to tell which direction the camera is pointing.

  • Box Cameras: These cameras are also used both indoors and outdoors and are the standard type of security camera we all picture. The lens and direction of the camera are clearly visible, signaling to everyone that the location is under CCTV surveillance.

  • Bullet Cameras: These are small, cylindrical, waterproof housings usually used for outdoor cameras, especially when you don’t want them to attract much attention.

6) Vulnerability: Apart from the indoor/outdoor choice, you might also want to select “vandal-resistant” cameras, which come with very hard-to-break glass covers to protect the camera against vandalism. There are also explosion-proof cameras, which are extremely expensive and are designed for hazardous environments.

7) Other Features: In addition to the main parameters above, new IP cameras come with a constantly expanding list of features and enhancements. These include video analytics and image enhancement, web interfaces for direct viewing and remote monitoring and control, automatic alert notifications via email and SMS, and even internal NVRs for video recording. These are also parameters you might want to check.

Please note that while I have tried to focus only on the key factors, designing a professional CCTV solution still requires a high level of expertise and experience, and it is always worth having a professional company involved in designing a CCTV solution tailored to your needs.

V-Shape: Time to get your data center into shape

Today, organizations are challenged with unpredictable and explosive data growth while still depending on applications and services running in siloed IT configurations. IT effectiveness is increasingly affected by the expansion of underutilized hardware, isolated management tools, and heavy demands on operations resources. The inefficiency associated with IT complexity affects small, midsize, and enterprise organizations alike. Some businesses have already begun to move away from inflexible IT silos toward a shared, virtualized IT architecture that helps improve responsiveness, lower costs, and drive business innovation. Organizations with limited resources and tight IT budgets need an affordable solution that is simple to deploy and manage, flexible enough to meet their current needs, and easily scaled for future growth.

Common challenges surrounding virtualization:

  1. Level of IT skills competency
  2. Myriad of technologies
  3. Skilled resource required for designing, implementing and integrating virtualization

The advantages of vShape at a glance:

  1. Reduce your investment and operating costs and improve your control over your IT infrastructure, while still maintaining flexibility in your choice of operating systems, applications, and hardware
  2. Focus on business critical projects instead of spending too much time on resource intensive tasks
  3. Use your existing IT resources more efficiently and reduce your data center investment costs
  4. Reduce your energy consumption and space requirements and the associated costs

What is VoIP, and what are its Main Benefits?

The term VoIP comes up widely in communications and technical discussions these days. Some picture it as fancy, expensive telephones; others understand it as very cheap international calls over the internet.

So I’ll try to give a quick, simple explanation of what VoIP is and what its benefits are.

VoIP stands for “Voice over Internet Protocol” – so in the most general definition, it is the term used for the technology that transfers voice over IP (i.e., computer networks, including the internet).

Practical VoIP Services

In practice, VoIP covers a broad range of different telephony technologies / services. The main ones are:

  • Free phone calls over the internet: these are services where you use your computer or internet-enabled mobile phone to make a voice call with someone else over the internet using the same service. Probably the most popular example is Skype. Such calls are free, as you only pay for your internet service in general. Apart from Skype, there are now many similar services to choose from.
  • Professional VoIP telephone sets and PBXs: old analog office telephones and intercoms have now been almost completely replaced by VoIP-based equivalents. Some of the most famous names at the higher end are Cisco and Avaya. However, there are also many lower-cost solutions and free VoIP PBX software, such as Asterisk, that can be used to set up very advanced business solutions for offices.
  • VoIP for International Calls: another service, used on its own or in conjunction with the above solutions, is VoIP for international calls. Unlike traditional POTS (Plain Old Telephone System) services, which are very expensive for international calls, these services can save up to 90% on international call costs. The reason is that while POTS establishes a dedicated connection from your location all the way to the destination, with VoIP your connection is carried over the internet at no extra cost up to the destination city, and you pay for the call as if it were a local call.
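
The “up to 90%” figure is straightforward arithmetic on per-minute rates. The rates in the sketch below are made up purely for illustration:

```python
def savings_percent(pots_rate: float, voip_rate: float) -> float:
    """Percentage saved per minute by calling over VoIP instead of POTS."""
    return (pots_rate - voip_rate) / pots_rate * 100

# Illustrative rates in USD per minute for an international call
print(round(savings_percent(1.00, 0.10)))  # 90% saved
```

Actual rates vary by provider and destination, but the comparison works the same way.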

6 Main Benefits of VoIP

The 6 main benefits are:

  1. Major cost savings in both implementation and call rates.
  2. Easy integration with other business and communication systems.
  3. Lots of highly useful features such as CallerID, follow-me/find-me, auto-attendant, call lists, multiple numbers, voice mail, etc.
  4. High portability, with the ability to seamlessly interconnect remote sites.
  5. Higher voice quality, immune to interference.
  6. Lower maintenance costs.

Conclusion

VoIP is very rapidly replacing old analog solutions. We can now confidently say that it is the future of telephony.

IPTV VS Analog MATV – What’s the correct choice for distributed TV entertainment?

One of the key requirements in every hospitality infrastructure (be it a hotel or a man-camp at a remote oil and gas site) is to provide guests with a selection of TV entertainment channels in each room.

These solutions usually consist of a “head end” system where satellite and/or terrestrial TV channels are captured and then distributed over the TV network to each TV installed in the rooms.

Up until only a few years ago, the only well-known solution was MATV (Master Antenna TV also known as SMATV – Satellite Master Antenna TV), where the TV channels were “modulated” at the head end over different TV channel frequencies and then distributed over a coaxial distribution network.

Recently, however, there has been a growing demand for IPTV solutions. As the name suggests, an IPTV system relies on LAN infrastructure (the computer network) to fulfil the same function.

Benefits of IPTV over MATV/SMATV

IP-based technologies are clearly the future across the board, and the same is true here! There are many benefits of an IPTV system over MATV/SMATV – some of the most important ones are:

  1. Higher picture quality: an analog MATV system is highly susceptible to poor image quality and distortions like ghost images and snowy pictures, while in a well-implemented IPTV system there is no degradation of picture quality.
  2. Unified communication medium: if a proper LAN infrastructure is in place, or if we’re talking about a completely new installation, IPTV needs no separate cabling beyond the LAN infrastructure, while MATV would require separate coaxial cabling.
  3. More channels: while in an analog MATV system the total number of channels that can be broadcast is limited to 80, in an IPTV system there is no such limitation as long as the proper LAN infrastructure is in place.
  4. Interactivity: unlike analog MATV which is a one-way system, IPTV solutions provide 2-way interaction giving access to great features such as internet browsing, Video on Demand (VoD) and customized hospitality features.
  5. Wide distribution: because IPTV is based on IP technology, the solution can be distributed over wide areas – so much so that many providers now offer IPTV services over the internet.

Challenges of IPTV

Based on the above benefits of an IPTV system, should you go ahead and replace your existing SMATV tomorrow? Well, there are key challenges that might make this a difficult decision to make:

  1. Need for a good LAN infrastructure: IPTV systems can’t be implemented over just ANY existing data network. You need to make sure the existing network can actually support the bandwidth required for the IPTV multicast packets, and that the network switches support features such as IGMP snooping. If not, you end up with a flooded data network and totally unusable, jittery TV images.
  2. High Equipment Costs: although, like all such new technologies, the price of IPTV equipment drops sharply every year, the extra cost can still be a decisive factor here.
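As a rough illustration of the bandwidth check mentioned in point 1, the sketch below estimates the aggregate multicast load of a head end and compares it against a trunk link. The per-channel bitrates are assumptions, not vendor figures:

```python
# Rough sketch of an IPTV bandwidth feasibility check: estimate the
# aggregate multicast load and compare it to link capacity.
# Bitrates are assumptions; real figures depend on codec and resolution.

SD_MBPS = 4    # assumed bitrate per SD channel
HD_MBPS = 10   # assumed bitrate per HD channel

def iptv_load_mbps(sd_channels: int, hd_channels: int) -> int:
    """Aggregate multicast bandwidth if all channels are streamed."""
    return sd_channels * SD_MBPS + hd_channels * HD_MBPS

def link_ok(load_mbps: int, link_mbps: int, headroom: float = 0.75) -> bool:
    """Keep the IPTV load under a utilisation ceiling on the trunk link."""
    return load_mbps <= link_mbps * headroom

load = iptv_load_mbps(sd_channels=60, hd_channels=20)
print(load)                 # 440 Mbps aggregate
print(link_ok(load, 1000))  # fits on a gigabit trunk with headroom
print(link_ok(load, 100))   # would flood a 100 Mbps network
```

Note that IGMP snooping confines each stream to ports that requested it, so access ports only carry the channels being watched – but the trunk from the head end still needs capacity for the aggregate.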

The other alternative: DVB MATV

In cases where a strong LAN infrastructure is lacking, replacing a poor-quality SMATV system with IPTV may not seem cost effective. However, there is still another alternative to consider: DVB (Digital Video Broadcast) MATV. These systems work over the same coaxial network used by MATV, but broadcast the channels in DVB (digital) format. They offer a mid-way yet cost-effective solution that considerably enhances the picture quality of an old MATV system without the need to change the cabling structure. (This is worth considering for the renovation of entertainment systems in large hotels, where re-cabling the whole building would be too much trouble.)

Future of TV Entertainment

Without a doubt, analog MATV systems are rapidly becoming outdated and being replaced by IPTV systems. It is up to current MATV owners to decide the “right time” for the swap!

Analog CCTV vs IP Cameras – What’s the Correct Choice?

If you search the internet, you can find many debates about analog CCTV vs digital IP cameras. The main question is this: are IP cameras the sole players in the future of CCTV, or are there still some good reasons for implementing analog cameras?

Today, I see many CCTV designs proposed for buildings that are based on analog CCTV solutions. Unlike some designers who still propose analog cameras, I can confidently say that the future of CCTV surveillance is IP-based systems. Nevertheless, there are still rare cases where one might decide to propose an analog camera – I’ll give a hint on those at the end of this article.

In this article, I want to quickly go over this topic and provide an easy-to-understand explanation.

The key difference between analog CCTV and IP Cameras

Without going into too much technical detail, the two systems can be quickly defined as below:

  • Analog cameras transfer the video signals in analog form (electrical signals), usually use coaxial cables for the cabling, and have the videos recorded by a DVR (Digital Video Recorder), where each single camera is directly connected to the DVR.
  • IP cameras encode the video signal into IP packets, use the data network (LAN) for the cabling, and have the videos recorded by an NVR (Network Video Recorder) that can be connected anywhere on the network.

Both types of cameras use the same mechanism for capturing video with their CCD sensors; the main difference is the method by which the video signal is transmitted.

Benefits of IP Cameras over Analog CCTV

  1. Higher image quality: unlike a few years ago, when cameras had poor video resolution, we now have megapixel IP cameras that totally outmatch any analog camera solution. The higher pixel resolution of IP cameras means you can zoom into much finer details of a scene, even after it has been recorded, without losing clarity.
  2. Unified cabling infrastructure: by utilizing the same LAN infrastructure, IP cameras can usually be deployed with no need for major re-cabling. It also enables seamlessly utilizing different network media such as wireless and fiber links.
  3. No major interference / distortion hassle: in analog systems, especially when cameras are more than a few hundred meters/feet away from the DVR, interference and distortion due to electrical noise, poor-quality connections, and ground-loop effects can create tricky situations requiring extensive effort to overcome. With IP cameras, one need not worry about interference or image quality issues.
  4. Power arrangements: IP cameras can mostly be powered over the same network cable through PoE (Power over Ethernet) by simply connecting them to a PoE-capable network switch, eliminating the need for a separate power source. This is not the case with analog cameras, where each camera needs a separate power source.
  5. Easy management: IP cameras can be easily managed and controlled remotely, which considerably simplifies and speeds up troubleshooting. One can easily check the connectivity of each individual camera over the network using a laptop with proper authentication, while with analog cameras, physically attending each camera with separate monitoring tools is a must.
  6. Lots of extra features: new IP cameras come with a constantly expanding list of features and enhancements – these include video analytics and image enhancement, a web interface for direct viewing and remote monitoring and control, automatic alert notifications via email and SMS, and even built-in recording without an external NVR.
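As a side note on what megapixel IP cameras imply for the network and the recorder, here is a hedged back-of-the-envelope storage estimate. All figures are assumptions – real bitrates depend on resolution, codec, and scene activity:

```python
# Back-of-the-envelope sketch: NVR storage budget for continuous
# recording from IP cameras. The bitrate is an assumed H.264 figure.

def storage_tb(cameras: int, mbps_per_cam: float, days: int) -> float:
    """Storage needed for continuous recording, in terabytes."""
    seconds = days * 24 * 3600
    total_bits = cameras * mbps_per_cam * 1e6 * seconds
    return total_bits / 8 / 1e12   # bits -> bytes -> TB

# Assumed: 16 cameras at 4 Mbps each, 30-day retention
print(f"{storage_tb(16, 4.0, 30):.1f} TB")  # about 20.7 TB
```

The same per-camera bitrate figure also feeds directly into sizing the LAN links and PoE switch uplinks that carry the camera streams.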

When can I still consider Analog Cameras?

With all the technology enhancements, many of the arguments justifying analog cameras are no longer valid and belong to the past. Arguments that analog cameras have better image quality or cost less were valid a couple of years ago, but not anymore.

But there are two design conditions when one might still justify an analog camera solution:

  1. Very small systems for small shops: if you want a very simple and cost-effective setup with up to 4 cameras connected by very short cables to a DVR for basic surveillance of a small shop, analog cameras are probably still worth considering for 1-2 more years.
  2. Distributed, distant cameras with no existing network infrastructure: there might be some rare cases where a simple surveillance solution is needed, with a few cameras spread in different directions several hundred meters/feet from the control room. In such cases, if no network infrastructure is available, one might still consider an analog camera solution for the sake of lower implementation costs.

Conclusion: after reading this article, if you see a designer proposing a camera system based on “coaxial cables”, you can confidently conclude that you are in the wrong hands!

Which Telephony Solution Costs Less to Implement – VoIP or analog POTS?

Telephones have always been, and will continue to be, a part of any office, hotel, or industrial infrastructure. Today, there are still many design requirements where you see requests for analog phone systems, citing cost-saving reasons.

So the question would be: What is the most cost effective solution for telephony systems – Voice over IP (VoIP), or analog Plain Old Telephone Service (POTS)?

If you’re thinking “Of course VoIP”, I’m afraid that’s not a fully accurate answer – there are also exceptions!

Below I briefly explain which of the two costs less to implement:

What are the situations where VoIP systems cost less to implement?

There are many situations where VoIP systems are simply the better option for telephony when it comes to saving costs.


1) Scattered and Large Infrastructures

The cabling costs for an analog or VoIP telephony solution in a very small office might not differ much, but the difference is immense if we’re talking about large building infrastructures with hundreds of phone sets. In a VoIP setup, the same LAN network can be used for VoIP telephones, with no need for a separate cabling infrastructure. In an analog telephone solution, however, you would need to lay copper cables from the PABX to every single point requiring a telephone connection, calling for expensive multi-pair copper cables for the telephony backbone.

2) Where Scalability is Demanded

In analog systems, a pair of twisted copper wires must connect each phone to the PABX. This means scalability has to be designed in from the beginning, using multi-pair cables with enough spare pairs to allow for future expansion. Without adequate spares, laying new cables is simply unavoidable. In VoIP systems, there is no need for new cabling in the network backbone to accommodate expansion.

3) When a High Number of Phones are Needed

Analog PABXs need individual circuits for every extension, and as the number of extensions increases, they simply become much more expensive than VoIP PABXs. A VoIP PABX can be as small as a 1U rack server, with no need for separate per-line electronics.
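The scaling difference can be sketched with made-up placeholder prices. The point is the shape of the curves (per-extension hardware vs per-seat software), not the numbers themselves:

```python
# Illustrative cost-scaling sketch: analog PABX with per-line circuits
# vs a server-based VoIP PABX. All prices are hypothetical placeholders.

def analog_pabx_cost(extensions: int) -> int:
    """Base cabinet plus a dedicated line circuit per extension."""
    base, per_line_card = 2000, 80
    return base + per_line_card * extensions

def voip_pabx_cost(extensions: int) -> int:
    """A 1U server plus per-seat licensing; no per-line electronics."""
    server, per_seat_license = 1500, 10
    return server + per_seat_license * extensions

for n in (10, 100, 500):
    print(n, analog_pabx_cost(n), voip_pabx_cost(n))
```

With these placeholder figures, the analog system is only marginally more expensive at 10 extensions, but several times the cost at hundreds of extensions.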

VoIP systems of course come with many extra benefits in addition to cost savings – such as a long list of features not available on analog systems, full integration with computer and data networks, much easier maintenance, software upgradability, and much more.

Therefore, except for very small offices with fewer than a handful of phones, I can’t think of any other scenario for proposing a full-analog POTS PABX anymore. But this is not the end of the story!

What are the situations where analog systems cost less to implement?

Don’t be surprised! There are still many situations where an analog system can be the more appropriate solution for at least part of an infrastructure’s telephony system.


1) Distant Locations with no Means of Electricity

VoIP phones need power to work! This is usually provided by a PoE (Power over Ethernet) network switch over the same network cable that connects the phone to the switch. However, this limits the distance to no more than 100 meters (about 300 feet). With analog POTS phones, the power is provided over the twisted-pair cable, which can be extended up to 2 miles (3 km) or more. It is not a rare scenario in industrial infrastructures for the telephony system to need to reach remote locations.

2) Hotlines and Emergency Phones

Analog phones are still commonly used in industry as hotlines / emergency phones – they are independent of any network infrastructure and hence still work in emergencies such as power cuts that shut down the infrastructure network.

VoIP and analog each have their own strengths in specific situations, which allows you to save on costs by implementing the right telephony system for each. Today, there are technologies that allow VoIP and analog systems to interface with each other: FXS cards or ATA adapters are usually used to connect an analog phone system to the now very common VoIP PBXs.

The Challenges of Unified ELV Implementation

In my recent article 5 Reasons Why Integrated ELV Systems Reduce Costs, I tried to explain briefly how a well-designed integrated and IP-based unified ELV system considerably reduces construction costs.

However, like any other good thing, this comes with its own set of challenges to tackle!

Integrating complex ELV systems into a single LAN network requires extensive design experience and very good knowledge of the technologies being integrated.

The network designer who is responsible for the design of the unified ELV system must have an in-depth understanding of both the passive layer (cabling/containment) and the active layer (network switches and routers).

One of the main challenges in utilizing the same network for different systems is calculating the required bandwidth. This is especially important to ensure that audio/video technologies such as CCTV (surveillance cameras), IPTV (TV system) and VoIP (telephony) – systems that depend heavily on IP streaming – do not suffer from jitter.

Some of the design considerations needed to achieve the best results are listed as follows:

  • Ensuring a proper network topology with a correct IP plan that allows expandability
  • Ensuring the usage of correct network switches to achieve the needed bandwidth while controlling the costs
  • Determining the correct backbone media to use for future expansion
  • Ensuring proper network segregation by implementing Virtual LAN (VLAN) techniques
  • Implementing IGMP snooping for controlling IPTV multicasts
  • Ensuring backbone redundancy
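The bandwidth calculation mentioned above can be illustrated with a simple budget sum across the systems sharing the backbone. The per-system loads below are assumptions for a small site, not vendor data:

```python
# Illustrative sketch of a unified-ELV bandwidth budget: sum the
# worst-case load each system places on the shared backbone so the
# uplinks can be sized. All per-system figures are assumptions.

system_load_mbps = {
    "CCTV (32 cams @ 4 Mbps)": 32 * 4,
    "IPTV (40 channels @ 8 Mbps)": 40 * 8,
    "VoIP (100 calls @ 0.1 Mbps)": int(100 * 0.1),
    "Data / other": 200,
}

total = sum(system_load_mbps.values())
print(f"Aggregate backbone load: {total} Mbps")

# Check candidate uplink speeds, keeping 40% headroom for expansion
for uplink in (1_000, 10_000):
    fits = total <= uplink * 0.6
    print(f"{uplink} Mbps uplink sufficient: {fits}")
```

In practice each system would sit in its own VLAN, but the shared trunks still need to carry the aggregate, which is what this sum sizes.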

Another challenge to tackle at the design level of a unified ELV system is selecting the right products. There are so many brands for so many systems that need to be integrated into a unified, IP-based solution – and due to this diversity, there is no single brand that even claims to cover everything.

So it becomes absolutely critical to ensure that the right products from the right vendors are chosen for each specific project, based on the project requirements and priorities. It is also very important to ensure that the selected solutions can actually blend seamlessly into a single, unified solution.

5 Reasons Why Integrated ELV Systems Reduce Costs

In this article I want to quickly go over 5 reasons on why designing integrated ELV systems considerably reduces construction costs.

Unfortunately, many ELV designers still base their designs on traditional systems running on proprietary networks for various building management systems, as well as separate telephony, data and television networks each with their separate/multiple cabling systems.

This results in increased costs, limited functionality, and complex management.

The solution is a modern design based on the integration of all ELV systems over an IP-based network. This calls for a higher level of technical expertise and experience, and a good combination of networking and ELV knowledge. As you will see below, the results are higher efficiency at lower OPEX and CAPEX, and a reduced risk of delays:

1- Unified Cabling and Pathways

Unlike traditional solutions, where each system needs its own cabling and pathway (CAT6 for the network, multi-core twisted-pair copper for telephony, coaxial cable for TV systems, twisted cable for PAGA, control cable for ACS, …), in a modern IP-based ELV design all systems mainly share the same common data network, considerably decreasing the costs of cabling and pathways.

Of course, the installation of common cabling and pathways requires planning the containment systems at the early stages of the project, so that optimum routes can be designed by experienced network designers.

2- Fewer Quality / Interference Problems

One of the key challenges of traditional analog designs, especially for systems such as CCTV, telephony, and television, is the complication during design and installation of ensuring proper audio and picture quality. Although things might look good on paper, during installation and commissioning many unforeseen problems usually pop up, causing unexpected delays to project timelines and increases in forecast costs. Problems such as grounding issues, poor-quality cables and connectors, and electrical and ground-loop noise directly affect analog solutions. In IP-based solutions, with a digital/IP backbone, these issues are no longer a concern.

3- Lower Costs for Expandability

Most traditional ELV systems are very limited in their means of expansion over large compounds. For example, to expand a conventional analog telephony, CCTV, or public alert system over a medium-sized compound, kilometers or miles of multi-core copper cables need to be physically laid – cables that are both expensive and hard to install. With new IP-based solutions, a network cloud consisting of all types of connection media, such as much cheaper fiber cables or even microwave links, can be used to seamlessly interconnect remote areas at a fraction of the cost of traditional solutions.

4- Easier Management and Troubleshooting

IP-based unified ELV solutions are by far easier to manage and troubleshoot, because maintenance staff do not need to deal with multiple cabling systems and connections, and the cabling (the physical layer, in data-network terms) is easily managed and checked. The overall experience and troubleshooting time required to maintain the systems are also considerably lower, and a single computer is usually sufficient for managing and troubleshooting all the systems from a centralized location.

5- Capability for Remote Management

Unlike traditional analog solutions, IP-based systems can be remotely managed and reconfigured, with minimum physical changes required. This saves considerably on maintenance costs while bringing many new features and possibilities.

Other Benefits of Unified ELV Systems

Cost savings are not the only benefit of modern IP-based integrated ELV systems – there are more, such as:

  • Many more functions and features
  • Expandability with no need to redo the infrastructure
  • Software upgradability
  • Integration of different systems

In a future article I will explain the challenges of designing modern integrated ELV systems.

What is ELV after all?

According to the International Electrotechnical Commission, ELV (Extra Low Voltage) is defined as any system operating at a voltage not exceeding 50V AC (or 120V ripple-free DC). Although the term is technically correct from an “electrical” point of view, it by no means describes the broad range of systems and technologies known as ELV systems in buildings.

ELV is the term used in the construction world in an attempt to collectively define all the systems in a building that need electricity to run but are not part of the building’s main electrical system. ELV covers all the modern technologies that are increasingly becoming must-have systems in every building, such as the data network, CCTV, fire alarm systems, public address systems, audio/video solutions, access control and intrusion detection systems, home automation, and much more!

The fact that such a broad range of technologies is collectively named “ELV” probably shows how much we are lagging behind the demands of the 21st century. Below I will very briefly explain some of the confusing abbreviations that constantly pop up when discussing ELV systems.

LAN and WLAN

LAN stands for Local Area Network; its cabling is also known as SCS (Structured Cabling System). Simply put, it is the data cabling in the building that enables users to network their computer devices and possibly access the internet.

In a small building, this can simply be CAT6 cabling from an MDF (Main Distribution Frame – i.e. the equipment rack), while in larger infrastructures you usually see multiple IDFs (Intermediate Distribution Frames) interconnected by fiber optic cables. (It is interesting that while no electricity passes through fiber optic cables, they are still categorized as “ELV”!)
WLAN stands for Wireless LAN, which is the network of wireless access points that provide wireless coverage inside and outside the building.

As you will see below, as technologies advance, more and more ELV systems depend on a building’s LAN infrastructure.

Telephony Systems

While analog telephony systems were still commonly used in buildings until a few years ago – set up using multi-pair copper telephone cables connecting the PABX (Private Automatic Branch Exchange) to the telephone sets – they are now almost totally replaced by VoIP (Voice over IP) solutions that require no separate cabling and rely on the building’s LAN infrastructure for interconnections.

CCTV

CCTV stands for Closed Circuit TV (again, a very old acronym that shows the “electrical” roots of such systems). Simply put, these are the camera systems set up inside and outside buildings to provide surveillance. Old analog cameras used separate coaxial cabling connecting each camera directly to the DVR (Digital Video Recorder). Today, these too are almost totally replaced by IP cameras utilizing the building’s common LAN infrastructure.

ACS

ACS is the abbreviation for Access Control System. ACS solutions grant access to different building locations (usually by automatically unlocking doors) through different means of authenticating people (magnetic or RFID identification cards, fingerprint, iris or face recognition). Almost all new ACS solutions also rely on the LAN infrastructure to some extent, while they also include electrical cabling to magnetic door locks, manual push buttons, and magnetic sensors installed on doors and entry gates.

IDS

IDS (Intrusion Detection System) is the common name for a broad range of technologies that, as the name suggests, alert on any attempted intrusion into a building or premises. They include long- and short-range radar systems, fiber optic cable systems attached to fences, IR motion detectors, CCTV video-analysis software, and many other technologies.

Fire Alarm

Fire Alarm Systems (also abbreviated FA or FAS) can be divided into two main types – conventional and addressable. Most FA systems still use 2-wire electrical cables for interconnecting the sensors (smoke, heat, combined) and beacons/sounders with the control panels. Newer fire alarm solutions also provide LAN connectivity for integration with other systems.

PAS / PAGA

PAS (Public Address System) or PAGA (Public Address and General Alarm) is the speaker system installed in buildings for making announcements, playing background music and broadcasting pre-recorded alarm notifications, sometimes automatically triggered by fire alarm systems. PAS is probably one of the few ELV systems that is still not widely IP-based; it uses twisted-pair electrical cables to connect the distributed speakers to the power amplifiers. However, most newer PAS systems have accessories that enable the use of LAN infrastructure to interconnect the main components and provide a distributed design.

SMATV / CATV / IPTV

SMATV (Satellite Master Antenna Television), CATV (Cable Television), and IPTV (IP Television) describe different technologies for providing a TV distribution system within a building, interconnecting multiple television sets to a single source (usually called the head end) so that each television can select the desired channel from a list.
While SMATV and CATV have their own separate cabling networks based on coaxial (or sometimes fiber) cabling, they are rapidly being replaced by IPTV solutions, which rely on the same LAN infrastructure jointly used by other systems.

Home Automation

Home automation systems include a very broad range of technologies for monitoring and controlling almost everything in the building, from lights to doors to home appliances and audio equipment – in short, whatever runs on electricity. Most home automation solutions are now network-based and offer the option of remotely controlling and monitoring the building over the internet. While in the past most home automation relied on dedicated remote-control devices, with the advance of technology control has now moved to smartphone and tablet applications or voice recognition solutions.

The above are merely samples of ELV, and there are myriad other systems not discussed here that are collectively named “ELV” in construction terminology. One wonders: isn’t it time we changed the name to something that better explains what all these systems are about? What about MIT (Modern Infrastructure Technologies)?

Why hiring a single ELV designer makes no sense

I constantly encounter job advertisements where a construction design company is looking to hire a single ELV designer to handle all their ELV work. Well, for those familiar with the industry, this simply brings a bitter smile – there is no such magic!

This wrong perception is mainly due to a simplified comparison of ELV designs with electrical designs (perhaps because of the “Extra Low Voltage” terminology)! However, unlike electrical design, where a single electrical designer can very well do all the principal LV designs required for a building, in ELV we are dealing with a very broad range of diverse technologies, including but not limited to data, telephony, CCTV, ACS (Access Control System), PAGA (Public Address and General Alarm), fire alarm, IDS (Intrusion Detection System), television systems, audio/video solutions, and home automation.

Expecting a single engineer to properly cover all these technologies is like expecting the same doctor to be your Dentist, Allergist, Physiologist, Psychiatrist and Cardiologist!

What is the correct Solution?

Proper ELV design calls for a team of designers who together cover all the needed expertise, enabling them to provide an appropriate solution through combined effort and teamwork. The actual number of designers and their specializations of course depend on the complexity and scale of the project. However, even for the smallest solutions, below are the key members of an ELV design team:

  • Team Leader: the Team Leader should have a good general knowledge of all technologies and the needed capabilities to lead the team and to make sure the provided design covers all the specified criteria.
  • Network Engineer: IP Networking is the foundation of any modern ELV design. Proper design of the network infrastructure is therefore the fundamental necessity to guarantee a successful solution.
  • Safety and Security Engineer: a good engineer with sufficient expertise and experience in designing safety and security solutions including CCTV, ACS and Fire Alarm is a key member of any ELV design team.
  • Audio / Video Engineer: audio / video solutions are another part of most ELV designs which have their totally separate field of expertise and hence an audio / video expert is a key member of any ELV design team.

Designing the proper ELV systems – a challenge in today’s construction industry

Contrary to what we had 20 years ago, it is now very hard to find a new structure where ELV solutions are not part of the core construction requirements – whether the structure is a hotel, an airport, an office, or an industrial plant.

This is indeed a challenge for construction designers who are not well versed in the technology concepts of ELV and all the fuzzy technical jargon that comes with it.

ELV is an abbreviation for Extra Low Voltage. The acronym falls short of explaining its actual content and is mostly understood from an “electrical” perspective rather than what it really represents. ELV is actually a combination of data & telephony, safety & security, and automation technologies that are now an integral part of any modern structure in the 21st century.

While civil, electrical and sewage designs are considered integral disciplines of construction design and are normally done in-house by most EPC companies, ELV design is more like a Pandora’s box to most experienced construction designers, and they prefer to have it handled entirely by “someone else”!

When it comes to “ELV designers”, you can categorize them into 3 different groups:

  • The first group of ELV designers comes from an “electrical” background: their work usually includes some general block diagrams similar to the one-line drawings found in electrical designs, and totally lacks an understanding of the diverse “technological” aspects of the different ELV systems.
  • The second group of ELV designers comes from an “IT” background: although this group understands the technological aspects of a proper ELV solution, they can’t communicate them very well in the language and culture commonly used in the construction industry.
  • There is only a small group of ELV designers – the third group – who really master this new discipline whose time has come. A combination of “electrical” and “IT/technological” knowhow is necessary in order to properly design the ELV systems needed for this day and age.