October 22, 2019 | Written by: EfficientIP | DDI, Internet of Things, IPAM, Virtualization & Cloud
The future of cloud computing is near the edge. Edge computing brings compute and storage to the network perimeter, closer to where they are used. According to the ECCE (Edge Computing Consortium Europe), “The Edge Computing paradigm describes an approach to execute certain services closer to devices and thereby supplements centralized Cloud Computing solutions”. However, this new architectural approach brings infrastructure and network challenges with it. Caught up in this increased complexity are the IPAM repository and network functions such as DNS and application traffic routing.
By using data centers, either on premises or in the cloud, we have solved many IT problems. Concentrating resources in the data center is a smart move: it allows optimization, increases infrastructure resiliency and, thanks to virtualization, reduces the energy consumption of the hardware. The move towards cloud, however, highlights new challenges around regulation, new usages and security. These could push enterprises to move some data and workloads back closer to the users and to where they are consumed.
Bringing computation power and storage near the edge isn’t really an issue from a technological viewpoint. It requires the modern architectural patterns already proven in data centers, along with attention to service continuity constraints and data regulation where necessary. The first obvious edge service that comes to mind is the Content Delivery Network (CDN). A CDN provides storage for caching as well as the intelligence to direct users to their content, and it can optimize performance by storing static content in its local cache. While a CDN brings a lot of value for content delivery, it is limited to cached objects. Still, moving those objects to the edge of the cloud reduces response delay for users and limits bandwidth consumption on the global network, clearly enhancing the user experience.
More and more use cases require computing power, not just caching of content. For example, your connected objects (IoT) may produce a large amount of data for analysis, yet only a small portion of it is really needed centrally. The trending and alerting model (think digital meters) can be run at an intermediate level, enabling several layers of optimization. Using edge computation power here is valuable: data used solely for triggering alerts can be quickly discarded, and no bandwidth is consumed because only events are sent back to the cloud, as the sketch below illustrates. Users are given access only to valuable events, which are far more interesting than raw data. Besides, not all organizations have the capacity to store the huge volumes of data produced for a potential future usage.
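As a rough illustration of this filter-at-the-edge pattern, the Python sketch below keeps raw meter readings local and forwards only threshold-crossing events upstream. The threshold, the `read_meter` stub and the cloud endpoint URL are hypothetical placeholders, not part of any specific product.

```python
# Minimal sketch of edge-side filtering: raw readings stay local,
# only threshold-crossing events are forwarded to the central cloud.
# THRESHOLD, read_meter() and CLOUD_ENDPOINT are illustrative assumptions.

import json
import time
import urllib.request

THRESHOLD = 80.0                                      # hypothetical alert level
CLOUD_ENDPOINT = "https://cloud.example.com/events"   # placeholder URL

def read_meter() -> float:
    """Stand-in for a local sensor read; replace with the real driver."""
    return 42.0

def publish_event(event: dict) -> None:
    """Send a single event upstream; raw samples never leave the edge."""
    body = json.dumps(event).encode()
    req = urllib.request.Request(
        CLOUD_ENDPOINT, data=body,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=5)

def run_edge_loop() -> None:
    while True:
        value = read_meter()
        if value > THRESHOLD:          # only the event crosses the WAN
            publish_event({"ts": time.time(), "value": value})
        time.sleep(60)                 # raw samples are discarded locally

if __name__ == "__main__":
    run_edge_loop()
```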
Regulation in specific regions forces companies to store and process data within that region (e.g. GDPR in Europe). Cloud approaches, especially public clouds, are not always well suited to meeting these location requirements. You may need a specific service offered by a cloud provider, but only in another region. Data then has to be transported back and forth between services and regions, which is complex to set up and mostly impossible to audit in a timely manner. Furthermore, cloud providers are not always forthcoming about the location of data transiting between their services.
Use cases for edge computing are varied and extremely interesting. More and more mobile objects require computation, but many lack sufficient local CPU. Even if not all of these usages are business oriented, most bring real benefit to their users or their communities. We need to take good care of these new device families.
Sensors and detectors collect more and more data, which in some cases cannot be sent to the cloud due to technical constraints. Local computation for analysis can therefore be really beneficial. Imagine you need to analyze all the data collected by an airplane (engine, structure, weather conditions) during the time it is on the ground. Sending that data to the cloud for computation would require very large bandwidth as well as short transit delay, which is not always compatible with cloud connectivity (see long fat networks). Another example is medical and health data, which may require storage in specific certified environments with high levels of control. Given these strict constraints, public cloud providers, and most hosting operators, may not be suitable for protecting personal health data.
IoT (Internet of Things) is, of course, a vast topic in itself. Even if the media sometimes misuse the terminology, usages are increasing. IoT covers a large variety of devices, from the light bulb to the autonomous vehicle. Edge computing can help with storing data for intermediate processing, synchronous-to-asynchronous gateways, anonymization, event triggering or machine learning execution (e.g. for predictive maintenance). Some devices may require real-time computation power, as in AR/VR applications for industry or maintenance operations. Other usages require video and audio processing with very short delays for continuous, smooth human interaction. Sending data for analysis to a central data center with 150ms of transit delay is not compatible with human real-time perception (<100ms). For example, take a look at the transit delays from your browser to Azure blob object storage with this tool, or check your own setup as sketched below.
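One way to make the latency argument concrete is to measure the round trip to a central endpoint and compare it against the roughly 100 ms perception budget mentioned above. The sketch below uses only the Python standard library; the endpoint URL and the sampling count are illustrative assumptions.

```python
# Rough round-trip-time check against a central endpoint, to decide whether
# an interactive workload can tolerate cloud latency or should run at the edge.
# The URL and the 100 ms budget are illustrative assumptions.

import time
import urllib.request

ENDPOINT = "https://storage.example.net/healthz"   # placeholder central service
BUDGET_MS = 100.0                                   # human real-time perception budget

def measure_rtt_ms(url: str, samples: int = 5) -> float:
    total = 0.0
    for _ in range(samples):
        start = time.perf_counter()
        urllib.request.urlopen(url, timeout=5).read()
        total += (time.perf_counter() - start) * 1000.0
    return total / samples

if __name__ == "__main__":
    rtt = measure_rtt_ms(ENDPOINT)
    placement = "edge" if rtt > BUDGET_MS else "central cloud"
    print(f"average RTT {rtt:.1f} ms -> run interactive workloads at the {placement}")
```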
We also have the domain of monitoring and alerting. Industries and utilities haven’t waited for the Internet of Things to make use of information collected from the field. The main usages include enhancing services, monitoring activities and billing customers. Industrial sensors are everywhere, and the data or alerts they manage are really important in our day-to-day lives. Devices have moved to special radio networks (e.g. LoRa, Sigfox), some of which use IP. With these smart things, we are generally talking about smart metering and smart cities.
Another important subject covers tracking and inventory of assets. This can be in a large warehouse, or in a harbour full of containers, where assets are constantly moving and decisions need to be taken by handling vehicles or preparation robots. The data produced by lidars and tracker mapping systems needs to be handled quickly and is volatile enough not to be worth storing. In this space, not all devices are intelligent or IP-connected, so we need some gateway between the IP world and the “thing world” (e.g. RFID tags); a minimal example is sketched after this paragraph. This kind of transformation demands proximity to the usage, making edge computing an ideal solution.
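As a sketch of such a gateway, the snippet below turns non-IP tag reads into JSON datagrams on the local IP network. The `read_next_tag` stub, the collector address and the one-second cadence are hypothetical placeholders under the assumption that a nearby edge collector consumes the stream.

```python
# Minimal sketch of an edge gateway bridging non-IP devices (e.g. RFID tags)
# into the IP world: tag reads become JSON datagrams sent to a local collector.
# read_next_tag() and COLLECTOR are illustrative placeholders.

import json
import socket
import time

COLLECTOR = ("192.0.2.10", 9999)        # documentation-range address, placeholder

def read_next_tag() -> str:
    """Stand-in for an RFID reader driver returning a tag identifier."""
    return "EPC-000000000001"

def run_gateway() -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        tag = read_next_tag()
        msg = json.dumps({"tag": tag, "ts": time.time()}).encode()
        sock.sendto(msg, COLLECTOR)      # volatile data, nothing is stored locally
        time.sleep(1)

if __name__ == "__main__":
    run_gateway()
```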
Farm monitoring also uses sensors for which automation could be provided at the edge of the network. The wide variety of sensor uses includes animals wearing smart collars (mainly for localization and health monitoring), crop watering through sprinklers, and autonomous vehicles for specific treatments.
All these devices require quick interaction with local computing resources, as the cloud network could be far away. Most farms are not well covered by high-speed network operators, so access to cloud resources may not be responsive enough for some uses. The amount of data required for inferring and acting can be really large and volatile, meaning that pushing it to any cloud workload service is probably not optimal.
Weather monitoring can also take advantage of edge computing. By analyzing data from local sensors on the farm and global ones gathered from providers, quick decisions can be taken which are valid locally.
The list of edge computing needs is very long, and use cases are evolving with disruptive approaches that require IT technology to adapt quickly. Fortunately, new solutions are becoming available every day. We see three very different families of edge computing approaches:
To complement these three families, the new generation of mobile networks proposes in its standards a way for providers and enterprises to host part of the computation power related to mobile devices at the edge of the network (see 5G Multi-access Edge Computing – MEC). This proposed solution is directly linked to the NFV approaches that help virtualize the network.
Moving part of the storage and compute power from central clouds to the edge requires attention to the way IT systems are built. Obviously, virtualization and technologies like containers will help move application chunks from one location to another with enough abstraction to limit the impact. Networks will have to adapt as well: multicast for service discovery, integration of repositories directly in the network, SDN, NFV, and a zero trust approach for security. Log and analytics systems will be mandatory in order to know which job has been processed where, not only for billing purposes but also for traceability. A simple service discovery example is sketched below.
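To give one concrete angle on service discovery in such a setup, the sketch below resolves a DNS SRV record to locate the preferred instance of an edge service. It uses the dnspython library; the record name is an illustrative assumption, not a published entry.

```python
# Sketch of DNS-based service discovery for an edge workload: resolve an
# SRV record and pick the highest-priority (lowest value) instance.
# Requires dnspython; the record name below is an illustrative assumption.

import dns.resolver

SERVICE = "_analytics._tcp.edge.example.com"   # hypothetical SRV record

def discover_service(name: str) -> tuple[str, int]:
    answers = dns.resolver.resolve(name, "SRV")
    # Lower priority value wins; weight could be used for load spreading.
    best = min(answers, key=lambda r: (r.priority, -r.weight))
    return str(best.target).rstrip("."), best.port

if __name__ == "__main__":
    host, port = discover_service(SERVICE)
    print(f"preferred instance: {host}:{port}")
```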
One serious challenge in extending the cloud to the edge concerns management of the IP space: IP inventory and routing, naming conventions, resource inventory, device management, DNS and application traffic management all need to be addressed. These subjects are considered commodities by enterprises today, but engineers will need to reconsider their views in order to effectively push storage and workloads into totally virtual environments at the edge. Pivoting around an IPAM solution tightly coupled to this ecosystem and to engines like DNS will ease the move and provide continuity in the way IT systems are operated, along the lines of the final sketch below. Handling this complexity requires good understanding, rigor, an accurate dynamic inventory and, without a doubt, smart automation.
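The post does not prescribe a specific API, so the snippet below is a purely hypothetical sketch of the kind of automation step implied: when an edge node comes up, it registers its address and name in an IPAM/DNS repository through a generic REST endpoint. The endpoint, paths, fields and token are all placeholders, not the API of any particular IPAM product.

```python
# Purely illustrative sketch of 'register on boot' automation: an edge node
# records its IP address and host name in an IPAM/DNS repository via a
# generic REST API. Endpoint, paths, fields and token are placeholders.

import json
import socket
import urllib.request

IPAM_API = "https://ipam.example.com/api/v1"     # hypothetical endpoint
API_TOKEN = "changeme"                            # hypothetical credential

def local_ip() -> str:
    """Best-effort discovery of the node's outbound IP address."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.connect(("192.0.2.1", 80))                  # no traffic is actually sent (UDP)
    ip = s.getsockname()[0]
    s.close()
    return ip

def register(hostname: str, ip: str) -> None:
    """Create an address/name record so DNS and the inventory stay accurate."""
    record = {"name": hostname, "address": ip, "site": "edge"}
    req = urllib.request.Request(
        f"{IPAM_API}/records",
        data=json.dumps(record).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {API_TOKEN}"},
        method="POST",
    )
    urllib.request.urlopen(req, timeout=10)

if __name__ == "__main__":
    register(socket.gethostname(), local_ip())
```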