Performance Testing

IoT platform performance testing ensures that your infrastructure can handle millions of devices at scale without degrading the underlying services.

IoT cloud solutions are designed for scalability, but no one knows what will happen when that scale is actually reached. It may take a few years before your platform reaches the desired scale and capacity, and if there are critical design flaws, they won't show up until it's too late. It is therefore important to test your platform at full scale as early as possible. The problem is how.

Orchestrating a large-scale test comes with its own challenges. First, you have to synchronize the execution so that most clients start at once. Then you have to run the individual tests and collect the results into a clear visualization. Lastly, if something doesn't work, you will need to dig deeper and triage what was at fault. All of this can create a significant workload for your test teams.

IoTIFY provides a neat and elegant solution to performance testing challenges at scale, out of the box. Here is how:

Seamless scalability: Scalability is just a number when it comes to orchestration. We manage all the challenges of orchestrating up to a million endpoints, so for you it's a no-brainer. Simply spawn the devices you want simulated and we take care of the rest.

Out-of-the-box measurement: Basic performance metrics, such as message sending and message generation delays, are measured out of the box by the tool. By adding some simple logic, you can also measure application-level latencies with IoTIFY and visualize them in graphs via the metrics() API.
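As a rough sketch of what "adding some simple logic" could look like in a JavaScript message template: time the send yourself and report the elapsed value as a metric. The `metrics()` helper exists in IoTIFY, but the exact call shape, and the stubbed send function below, are assumptions for illustration only.

```javascript
// Hypothetical sketch: measure an application-level delay around a send
// operation so it can later be recorded via IoTIFY's metrics() API.

function measureLatency(sendFn) {
  const start = Date.now();            // timestamp before sending
  sendFn();                            // perform the send (stubbed below)
  return Date.now() - start;           // elapsed time in milliseconds
}

// Stand-in for the real publish; in a template this would go out over
// MQTT/HTTP using the protocol configured in the run settings.
const elapsedMs = measureLatency(() => { /* publish payload here */ });

// In a real template you would now record it, e.g. metrics(...) with the
// elapsed value, so it shows up in the result graphs.
console.log(`latency: ${elapsedMs} ms`);
```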

Detailed result capture: Each client and iteration is captured in detail by IoTIFY. You can drill down to the exact payload sent by each client, any messages received from the cloud, and the total time it took to complete each iteration. As a result, you can always find out what went wrong when triaging a situation.

Advanced analysis: Thanks to the built-in REST APIs within the template, your template can also measure internal parameters of the cloud platform (such as CPU usage or message queue congestion) and save them in correlation with device data. Furthermore, your payload contents can be changed dynamically based on the cloud response, making your test even smarter.
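The dynamic-payload idea can be sketched as follows: the next message a simulated device generates adapts to what the cloud said last time. The `state` object and the `throttle` response field here are hypothetical names for illustration, not IoTIFY's exact API.

```javascript
// Hypothetical sketch: adapt the next payload based on the previous
// cloud response. If the cloud asked us to slow down, back off the
// reporting interval before generating the next message.

function nextPayload(state, lastResponse) {
  if (lastResponse && lastResponse.throttle) {
    // Double the interval, capped at one minute.
    state.intervalMs = Math.min(state.intervalMs * 2, 60000);
  }
  return {
    deviceId: state.deviceId,
    intervalMs: state.intervalMs,
    temperature: 20 + Math.random() * 5, // simulated sensor reading
  };
}

const state = { deviceId: 'sim-001', intervalMs: 5000 };
const p1 = nextPayload(state, null);               // normal cadence
const p2 = nextPayload(state, { throttle: true }); // cloud requested back-off

console.log(p1.intervalMs, p2.intervalMs); // 5000 10000
```

Because the state persists across iterations, a fleet of simulated devices reacts to platform back-pressure the way real firmware would, which makes the load pattern more realistic.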

Let's have a look at some performance testing examples to understand the functionality.


Last updated 4 years ago
