
Sony unveils SDK for polarised camera modules

May 15, 2019 Read time: 2 mins

Sony Europe’s Image Sensing Solutions has launched a software development kit (SDK) for polarised camera modules which it says cuts machine vision application design time and costs.

Stephane Clauss, senior business development manager Europe at Sony, says the company has worked with customers to identify key functions for the XPL-SDKW and develop optimised algorithms.

“Depending on the dev team and application, a standard polarised-camera application would typically take between six and 24 months,” he continues.

Using the SDK and its image processing library, this can be cut to six to 12 weeks, Clauss says.

Created for Sony’s XCG-CP510 polarised camera module, the XPL-SDKW comes with a set of functions developed to run on a standard PC.

A ‘Cosine fit’ function lets developers define a virtual polariser angle for the whole image, while the ‘Average’ function creates a non-polarised image from the raw data, simultaneously exporting what a standard machine vision camera would see for comparison, the company adds.
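Polarised sensors such as the one in the XCG-CP510 capture four polariser-angle channels (0°, 45°, 90° and 135°) per pixel block. The XPL-SDKW’s internals are not public, but ‘Average’- and ‘Cosine fit’-style outputs can be sketched with standard Stokes-parameter maths; the function names and NumPy approach below are illustrative assumptions, not Sony’s API:

```python
import numpy as np

def stokes(i0, i45, i90, i135):
    """Linear Stokes parameters from the four polariser-angle channels.
    (Standard formulae; the SDK's actual implementation is not public.)"""
    s0 = (i0 + i45 + i90 + i135) / 2.0  # total intensity
    s1 = i0 - i90                       # 0/90 degree contrast
    s2 = i45 - i135                     # 45/135 degree contrast
    return s0, s1, s2

def average_image(i0, i45, i90, i135):
    """'Average'-style output: what a standard non-polarised camera sees."""
    return (i0 + i45 + i90 + i135) / 4.0

def virtual_polariser(i0, i45, i90, i135, theta_deg):
    """'Cosine fit'-style output: intensity behind a virtual polariser at
    angle theta, via I(theta) = (s0 + s1*cos(2t) + s2*sin(2t)) / 2."""
    s0, s1, s2 = stokes(i0, i45, i90, i135)
    t = np.deg2rad(theta_deg)
    return 0.5 * (s0 + s1 * np.cos(2 * t) + s2 * np.sin(2 * t))
```

For ideal data, setting the virtual angle to 0° or 90° reproduces the corresponding raw channel, which is a useful sanity check when wiring up any polarised pipeline.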

According to Sony, pre-processing functions calculate polarisation-specific information such as the ‘degree of polarisation’ and the ‘surface normal vector’.

Related Content

  • October 14, 2019
    Here unveils Live Sense road hazard SDK
    Here Technologies has released a software development kit (SDK) which it says provides real-time insights on driving conditions and upcoming obstacles without the need for connectivity. Here claims its Live Sense SDK uses artificial intelligence and machine learning to turn front-facing cameras such as smartphones and dashcams into vehicle sensors which can detect other vehicles, pedestrians or cyclists, potholes and road closures.
  • November 20, 2013
    Bluetooth and Wi-Fi offer new options for travel time measurements
    New trials show Bluetooth and Wi-Fi signals can be reliably used for measuring travel times, and at a lower cost than an ANPR system, but which is the better proposition depends on many factors. Measuring travel times has traditionally relied on automatic number plate (or licence plate) recognition (ANPR/ALPR) cameras capturing the progress of vehicles travelling along a pre-defined route.
  • June 27, 2018
    An innovation lab – not a burden
    Travellers want to be able to book multimodal journeys easily – and to be informed of problems and alternatives as they go. Adam Roark might just be able to help, finds Ben Spencer. The global shift in transportation towards members of the public wanting access to multimodal journeys is rapidly changing how people pay and plan ahead.
  • June 2, 2014
    Machine vision makes progress in traffic applications
    Machine Vision technology is easing the burden on hard-pressed control room staff and overloaded communications networks.