High-Tech Bridge, a web and mobile application security testing company, released a report on application security trends at Infosecurity Europe 2017, based on data collected this year from open sources, the ImmuniWeb Testing platform, and its own web security services. The report gives good insight into the immediate evolution of application security in general and into what happens as technologies like IoT go mainstream.

Here are the highlights of the report.

Bug bounty fatigue trend will keep progressing

The report states that 9 out of 10 web apps in the scope of a private or public bug bounty program running for a year or longer contained at least two high-risk vulnerabilities that crowd security testing hadn’t detected.

The fact that Google’s Project Zero Prize received no valid submissions makes it evident that researchers are unlikely to take up a project for which they may or may not be paid. Bug bounty programs are often seen only as an ‘easy money’ opportunity, which explains the lack of thorough research in crowd security testing and why many high-risk vulnerabilities go unnoticed. To get better results, Qualys and BugCrowd have now begun a partnership to employ researchers, offering them full-time jobs without that financial risk.

Security risks for the web interfaces of IoT devices

IoT is an innovative technology but is still in its infancy, which means there are security risks at present. High-Tech Bridge’s research found that over 95% of web interfaces and panels of IoT devices had noteworthy security problems, such as outdated software without update support, admin credentials that can’t be modified, and other critical vulnerabilities.

Human error still poses a risk to DevSecOps approach

A good majority of companies following the DevSecOps approach had at least one critical vulnerability that was the result of human error. For instance, someone may carelessly deploy an otherwise secure web app to a location that’s accessible to anyone without credentials. Reportedly, this only gets more complicated in a bigger organization, where numerous decision-makers and data handlers change their decisions simultaneously. The same applies to an Agile team: the bigger the team, the harder it becomes to preserve order.

Web server security needs improvements

According to the report, Content Security Policy (CSP) and various other security measures have been fully implemented on only about 2.5% of web servers globally. Though breach numbers aren’t yet alarming, the report’s findings call for greater awareness of potential security risks in web server configurations.
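
As a rough illustration of what such hardening looks like in practice, here is a minimal sketch of setting a CSP header and a couple of related response headers, assuming a Flask application; the policy values are illustrative and not taken from the report.

    from flask import Flask

    app = Flask(__name__)

    @app.after_request
    def set_security_headers(response):
        # Restrict scripts, styles, and other resources to the site's own origin.
        response.headers["Content-Security-Policy"] = "default-src 'self'"
        # Two of the "various other security measures" the report alludes to.
        response.headers["X-Content-Type-Options"] = "nosniff"
        response.headers["X-Frame-Options"] = "DENY"
        return response

    @app.route("/")
    def index():
        return "Hello"

    if __name__ == "__main__":
        app.run()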

Web Application Firewalls are unable to guard against complex flaws

Even in web applications protected by commercial web application firewalls (WAFs), 22% of SQL injection vulnerabilities were found to be fully exploitable, meaning cyber criminals would be able to extract sensitive data from the database with relative ease.

According to the study by High-Tech Bridge, various WAF bypass techniques were capable of at least partially exploiting 58% of these vulnerabilities. 2018 is expected to bring solid improvements that help web application firewalls guard against the most complex security threats.
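
Of course, a WAF is only a safety net; such flaws ultimately need to be fixed in the application itself. As a minimal sketch (not taken from the report), here is what closing a SQL injection hole with a parameterized query looks like in Python, using an illustrative users table:

    import sqlite3

    def find_user(conn, email):
        # Vulnerable pattern a WAF would have to catch at the network edge:
        #   conn.execute(f"SELECT id, name FROM users WHERE email = '{email}'")
        # Parameterized query: the driver treats email strictly as data, never as SQL.
        cur = conn.execute("SELECT id, name FROM users WHERE email = ?", (email,))
        return cur.fetchone()

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
    conn.execute("INSERT INTO users (name, email) VALUES ('Alice', 'alice@example.com')")

    # The classic injection payload no longer changes the query's meaning.
    print(find_user(conn, "alice@example.com' OR '1'='1"))  # -> None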

Conclusion

The report also covers breaches via the mobile backends of apps, risks like XSS and CSRF, the dwindling reliability of HTTPS encryption, and more. The findings demand a drastic upgrade to the security practices associated with web and mobile applications.

But despite the security risks, capable web and mobile app developers can still deliver applications that are secure. If you want to learn more about secure web and mobile applications, feel free to contact us.



In just a few years, IoT has expanded across the world, connecting millions of people through a wide range of devices. IoT-powered sensors have now started empowering businesses in new ways, changing how they operate and how they interact with people. And apparently, this is just the beginning.

IHS predicts that there will be over 75 billion IoT devices in the world by 2025.

The phenomenon is thus expected to bring more surprises next year, with new trends that will radically transform businesses and people’s lives in the coming years.

Here are a few IoT trends that may shed some light on the innovations lying dormant at the moment, and could very well come to light next year.

Impact on retail

The one sector that IoT is going to impact the most is retail. The arrival of IoT sensors has already proven lucrative not just for businesses but for consumers alike, and businesses now have innovative ways to market their products.

Many enterprises have started investing in sensor-based analytics not only to get insights on better and more engaging ways to reach out to customers but also to track customer behavior and purchase patterns across their stores. Sensor-driven retail shopping will soon change how people find and buy products they want.

Remolding healthcare services

The growing presence of big data in the healthcare industry, and the benefits it has brought, also raises hopes for what IoT can do once it finds its footing in healthcare. For starters, IoT is expected to change the way people access and pay for healthcare services.

Considering that many wearable gadgets now come with healthcare apps and features, IoT might just open doors to expanding these kinds of services further. Imagine a scenario where networked gadgets in hospitals remind patients when it’s time to take prescriptions or alert doctors when a patient’s blood pressure changes. IoT can make this a reality.
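
As a purely hypothetical sketch of that scenario, the alert logic on such a networked monitor could be as simple as the following; the thresholds and the notification function are illustrative assumptions, not an existing product’s behavior.

    from dataclasses import dataclass

    @dataclass
    class BloodPressureReading:
        patient_id: str
        systolic: int
        diastolic: int

    def notify_doctor(reading, message):
        # Stand-in for a real notification channel (pager, SMS, hospital system).
        print(f"[ALERT] patient {reading.patient_id}: {message}")

    def check_reading(reading):
        # Flag readings outside an illustrative "normal" band.
        if reading.systolic >= 140 or reading.diastolic >= 90:
            notify_doctor(reading, f"high blood pressure {reading.systolic}/{reading.diastolic}")
        elif reading.systolic <= 90 or reading.diastolic <= 60:
            notify_doctor(reading, f"low blood pressure {reading.systolic}/{reading.diastolic}")

    check_reading(BloodPressureReading("P-1024", systolic=152, diastolic=95))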

Reformed security measures

So far, it’s clear that IoT will bring about changes. And not all changes are positive. Advancements in IoT will certainly end up giving IT experts many challenges to overcome, primarily when it comes to network security. With billions of IoT devices spread across the globe, the security of these devices would be a matter of concern that needs to be addressed as soon as possible.

Though many have already started exploring potential risks and possible security enhancements, we’d most likely see them fall into place only next year. Cyber criminals would be trying to figure out ways to use IoT for all the wrong purposes. This means the good guys will have to go past their limits to learn about the technology and identify its vulnerabilities to preemptively enhance the defenses before IoT devices become too common. Blockchain is expected to play a vital role when it comes to security enhancements.

Blockchain meets IoT

Cryptocurrency built on blockchain was a phenomenal success, making blockchain one of the greatest developments in IT at present. It is now being eyed for its capability to facilitate seamless and secure transactions at a considerably reduced cost, allowing businesses to close deals faster while also turning insights from blockchain data into assets.

Combining it with IoT holds great potential: real-time data from IoT channels, for instance, can be used across transactions securely without compromising the privacy of the parties involved. Blockchain is also expected to address many of IoT’s security challenges, including data theft. Combined, the two technologies can be mutually empowering while opening new doors to data transactions and data monetization.
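
To make that privacy point concrete, here is a minimal sketch, assuming a simple append-only ledger, of recording only a salted hash of each IoT reading so the raw data never has to be shared; the field names and the in-memory ledger are illustrative.

    import hashlib
    import json
    import time

    def fingerprint_reading(reading, secret_salt):
        # Hash the reading together with a device-held salt; the ledger sees only the digest.
        payload = json.dumps(reading, sort_keys=True).encode()
        return hashlib.sha256(secret_salt + payload).hexdigest()

    ledger = []  # stand-in for a blockchain transaction log

    reading = {"device": "meter-17", "kwh": 3.2, "ts": int(time.time())}
    ledger.append({"digest": fingerprint_reading(reading, b"device-secret"), "ts": reading["ts"]})

    # Later, the device owner can reveal the reading and anyone can verify it against the ledger.
    assert ledger[-1]["digest"] == fingerprint_reading(reading, b"device-secret")
    print("reading verified against ledger entry")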

More startup capitalists, more access to capital

IoT has already turned investors’ heads, but with many questions about the technology remaining unanswered or disputed, many investors are still reluctant to fund its expansion. However, things are becoming clearer now, and 2018 will come with a catalyst that boosts the confidence of investors to take IoT more seriously.

This is why experts speculate that IoT-based ventures will not be short on startup capital next year. The transportation, mining, and manufacturing industries have already started investing in IoT to gain an edge in the soon-to-change market. Even though we don’t yet know what IoT is truly capable of and the technology is still in its infancy, IoT ventures will still be a priority for investors next year.

Conclusion

A technology that people were skeptical about just a few years ago is now proving itself, though not yet in all its glory. Still, consumers and enterprises alike have embraced it, owing to the kind of impact it is expected to have on global commerce and on the way people live and interact.

If you have any ideas on tapping into IoT to drive your business, we can be of help. Get in touch with us to learn how we can leverage IoT to future-proof your business and set it to grow more quickly.



Blockchain, one of the most significant developments in IT in recent years, has the potential to turn insights into assets for businesses when combined with analytics, and it promises data security and integrity on a different level. It still hasn’t gone mainstream, but it’s on its way there, subtly transforming businesses, financial corporations, and even governments.

According to Wikipedia, blockchain is basically a growing list of linked and secured records or ‘blocks’. The World Economic Forum (WEF) defines it in a different way.

A technology that allows parties to transfer assets to each other in a secure way without intermediaries. It enables transparency, immutable records, and autonomous execution of business rules.

“Immutable” is the keyword here. Thanks to blockchain, the information in a network remains in the same state, impossible to alter, for as long as the network exists. The ongoing evolution of blockchain is expected to improve many areas, including but not limited to immutable entries, audit trails, and timestamping.
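
A minimal sketch of that “growing list of linked and secured records” idea is shown below: each block stores the hash of the previous one, so altering any earlier record breaks the chain. This is only an illustration of the linking mechanism, not a production blockchain (there is no consensus, mining, or networking here).

    import hashlib
    import json

    def block_hash(block):
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def add_block(chain, data):
        prev_hash = block_hash(chain[-1]) if chain else "0" * 64
        chain.append({"index": len(chain), "data": data, "prev_hash": prev_hash})

    def verify(chain):
        return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

    chain = []
    add_block(chain, {"from": "A", "to": "B", "amount": 10})
    add_block(chain, {"from": "B", "to": "C", "amount": 4})
    print(verify(chain))             # True
    chain[0]["data"]["amount"] = 99  # tamper with an early record
    print(verify(chain))             # False: the altered block no longer matches its recorded hash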

When Big Data Meets Blockchain

In the context of Bitcoin alone, it’s hard to find a strong connection between blockchain and big data. But in a scenario where blockchain itself is a ledger for financial transactions, things look different. The prospects are surprising, and they apply even to stock trades and business contracts.

This is also why the financial services industry is keeping watch on blockchain, which can potentially reduce processing time from days to minutes.

In the financial services sector, blockchains will take the form of a grand canyon of blocks holding the full history of every recorded financial transaction. What that record lacks is analysis, because what blockchain essentially provides is integrity for the ledger.

This is where big data comes into play.

What Big Data Analytics Can Do

This year, a consortium of 47 Japanese banks partnered with Ripple, a blockchain startup, to test a blockchain project that facilitates money transfers between bank accounts. The goal was to reduce the cost of real-time transfers, which are riddled with risk factors like double-spending and other potential transaction failures; the blockchain managed to avoid most of those risks. In this scenario, big data analytics can make a significant difference. For instance, it can identify patterns in the way consumers spend, and it can flag risky transactions far quicker and more reliably than any other current means. This can considerably reduce real-time transaction costs.
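
As a minimal sketch of that kind of pattern-based flagging (a simple statistical rule, not the approach any of these banks actually uses), the idea is to compare each transfer against an account’s usual behavior; the threshold and fields below are illustrative.

    from statistics import mean, stdev

    def flag_risky(transactions, z_threshold=2.0):
        amounts = [t["amount"] for t in transactions]
        mu, sigma = mean(amounts), stdev(amounts)
        # Flag transfers whose amount deviates far from this account's usual range.
        return [t for t in transactions if sigma and abs(t["amount"] - mu) / sigma > z_threshold]

    history = [{"id": i, "amount": a} for i, a in enumerate([42, 38, 51, 40, 45, 39, 47, 4800])]
    print(flag_risky(history))  # only the 4800 transfer stands out against the usual 38-51 range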

For sectors other than banking, blockchain adoption is primarily a security enhancement measure. Healthcare, retail, and government establishments have started leveraging blockchain to prevent hacking and leaks.

In Real-Time Analytics

Real-time fraud detection was practically just a concept until the arrival of blockchain. Organizations relied on technologies that predict and prevent attacks by analyzing past events. With blockchain’s record of every transaction available, they can use real-time analytics to mine for patterns whenever necessary.

This feature, however, can be seen from two different perspectives.

  1. Because it can provide a record of every transaction, there is a concern that this can be exploited for wrong purposes.
  2. On the other hand, the improved transparency that blockchains grant to data analytics promises accuracy far better than what other means can achieve.

A Massive Potential to Uncover Data

Blockchain is gradually establishing its presence across multiple sectors. Given its growth, the data within blockchains covering banking, microtransactions, remittances, and the like will soon be worth trillions of dollars. The blockchain ledger is speculated to be worth at least 20% of the total big data market by 2030, with potential revenue huge enough to overshadow the combined revenue of Visa and Mastercard.

As big data will be crucial for this, data intelligence services will start popping up everywhere to help organizations uncover social and transactional data, and identify ‘hidden’ patterns.

Simply put, blockchain will most likely be the harbinger of new forms of data monetization, creating new marketplaces where businesses and individuals can share and sell data and insights directly, without a middleman. Businesses intending to leverage blockchain will have to use the best AI/ML solutions on top of blockchain-driven data layers to get a competitive edge in the present market. The wide-scale adoption of Bitcoin and the parallel growth of blockchain can together revolutionize conventional data systems, facilitating faster and more secure data transactions in a seemingly insecure cyber realm.

It’s high time to start thinking about leveraging blockchain for your business. AoT technologies have the AI/ML capabilities to support just that. Feel free to talk to us to learn more on how blockchain combined with AI/ML can transform your business.



According to the dataset from Redmonk and Bitnami, about 30% of container deployments are in production environments.

The impact of the open source Docker platform on DevOps and virtualization is evident from how developers discuss its prospects all over the internet, and from its increased use in the production environments of both SMBs and large-scale enterprises.

Last year, Datadog published a report on Docker adoption among its users, which showed a 30% increase in adoption in just a single year.

Docker’s growth momentum is still rising, and the capability it gives developers to create, deploy, and run applications more easily makes it a vital element in a DevOps ecosystem. While DevOps covers pretty much the entire delivery pipeline, Docker optimizes the production environments.

Before Docker made a name for itself

Before Docker came into being, developers, testers, and the ops team relied on a plethora of tools for configuration management. In addition, they had to deal with complex integrations and other issues that inevitably delayed the project, not to mention making it more complex in most cases.

The team had to work with various environments that needed to be kept optimally aligned with the project’s goals, and achieving that alignment required a lot of effort as well. In short, development back then wasn’t as efficient or fast as it is with Docker now.

The need for Docker arose primarily due to the evolution of application complexity over the years.

Relief for DevOps teams

Software developers in a DevOps environment are well aware of what Docker can do as a reliable environment for development. It allows the team to configure both development and test environments efficiently, which in turn results in successful release management.

With major cloud platforms like Microsoft Azure and Amazon Web Services offering support to the open source container system, Docker allows DevOps teams to deploy to any platform without concern for the underlying complexities. Add to that an extensive collection of official language stacks and repositories from DockerHub, and they have one of the most powerful tools to get the job done quickly.

Ops teams can package an entire application as a Docker image, without compromising the build version, before it gets added to a central registry. They don’t have to individually deploy EXE and JAR files to the target environment. The Docker image can then be pulled by the various environments (development, testing, production, etc.) for final deployment.
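
A minimal sketch of that build-once, deploy-anywhere flow using the Docker SDK for Python (the docker package) might look like the following; the image name, registry, and port mapping are illustrative, and a Dockerfile is assumed to exist in the current directory.

    import docker

    client = docker.from_env()

    # Package the whole application as one image instead of shipping individual EXE/JAR files.
    image, build_logs = client.images.build(path=".", tag="registry.example.com/myapp:1.0")

    # Publish the image to a central registry once...
    client.images.push("registry.example.com/myapp", tag="1.0")

    # ...and any environment (development, testing, production) runs the exact same artifact.
    container = client.containers.run(
        "registry.example.com/myapp:1.0",
        detach=True,
        ports={"8080/tcp": 8080},  # map the app's port 8080 to the host
    )
    print(container.short_id)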

While developers are relieved of setting up and configuring specific development environments every time, the ops team or system administrators can, thanks to Docker, set up environments akin to a production server, allowing anyone to work on the project with the same settings.

Docker’s role in DevOps

To begin with, let’s take Wikipedia’s definition of DevOps.

DevOps is a culture, movement or practice that emphasizes the collaboration and communication of both software developers and other information-technology (IT) professionals while automating the process of software delivery and infrastructure changes.

That said, in such an environment, Docker finds its use both as a platform and as a tool. Developers can use it as a platform to run applications while the operations staff can use it as a tool to facilitate integration with the workflow.

With Docker as a platform, developers can focus on building good-quality code. Despite being isolated, Docker containers share the same kernel and OS files, which makes them lightweight and fast; fast enough to make Docker one of the best ways to build distributed systems easily and efficiently, since applications can run either on a single machine or across many virtual machines. Docker also comes with a cloud service for sharing applications and automating workflows.

Generally, once development and testing of an application are done, the ops team will take up the responsibility to deploy that app. Before Docker, this phase was quite challenging as issues that didn’t occur during development might show up, giving sleepless nights to the team. Docker eliminates this friction allowing the ops team to deploy the application seamlessly.

A Docker-based pipeline in a DevOps environment considerably reduces the risks associated with software delivery and deployment. In addition, it ensures timely delivery at a lower cost. It effectively unifies the DevOps community as well, supporting popular provisioning and configuration tools like Chef, Ansible, and Puppet. From a technical standpoint, Docker facilitates seamless collaboration, which is the core essence of a DevOps ecosystem.

The present state of Docker

With Jenkins, another open source tool, becoming more popular thanks to its efficiency in orchestrating complex workflows, developers have started exploring the results of combining it with Docker.

Docker, Inc. decided to invest in build automation last year, and the community behind Jenkins developed many plugins for effective Docker-Jenkins integration. This ended up expanding the capabilities of Docker in the hands of developers, allowing them to create and implement build pipelines on Docker.

Word soon got out, and now many startups have finally started seizing the opportunity to leverage the potential of Docker-based build automation.

CloudBees was one of the first companies to embrace Jenkins and Docker-based build automation, evolving from being just a PaaS player by offering professional support and services for enterprises planning to adopt Jenkins and Docker.

Shippable, another company, adopted Docker for software build automation.

All these facts and more emphasize Docker’s dominant presence in today’s development realm, whether it’s used in the DevOps setup of a small startup or in an enterprise with large teams.

Conclusion

According to Datadog’s report,

Docker adopters quintuple (5x) the container count within 9 months after initial deployment. 

Because it’s open source, Docker brings even more perks to the table. Its ability to maintain consistency, productivity, and compatibility while providing reliable security and support for multi-cloud platforms, along with major corporate backing, makes it a valuable tool for companies putting their faith in DevOps.

With support from a huge and growing community, Docker will most likely be enhanced in the immediate future providing more out-of-the-box features and more integration choices.