
Sony unveils SDK for polarised camera modules

May 15, 2019

Sony Europe’s Image Sensing Solutions has launched a software development kit (SDK) for polarised camera modules which it says cuts machine vision application design time and costs.

Stephane Clauss, senior business development manager Europe at Sony, says the company has worked with customers to identify key functions for the XPL-SDKW and develop optimised algorithms.

“Depending on the dev team and application, a standard polarised-camera application would typically take between six and 24 months,” he continues.

Using the SDK and its image-processing library, this can be cut to six to 12 weeks, Clauss says.

Created to run with Sony’s XCG-CP510 polarised camera module, the XPL-SDKW comes with a set of functions developed to run on a standard PC.

A ‘Cosine fit’ function allows developers to define a virtual polariser angle for the whole image, while the ‘Average’ function creates a non-polarised image from the raw data, simultaneously exporting what a standard machine vision camera would see for comparison, the company adds.
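The article does not show the XPL-SDKW API, but the arithmetic behind a ‘cosine fit’ virtual polariser and an ‘average’ image is standard. As an illustrative sketch only (function names are hypothetical, and it assumes the sensor delivers four registered channels at 0°, 45°, 90° and 135°, as Sony’s on-chip polariser sensors do), the two outputs can be computed from the Stokes parameters:

```python
import numpy as np

def stokes(i0, i45, i90, i135):
    """Linear Stokes parameters from the four polarisation channels."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity
    s1 = i0 - i90                       # 0/90 degree difference
    s2 = i45 - i135                     # 45/135 degree difference
    return s0, s1, s2

def virtual_polariser(i0, i45, i90, i135, theta_deg):
    """Image behind a virtual linear polariser at theta_deg (Malus cosine fit)."""
    s0, s1, s2 = stokes(i0, i45, i90, i135)
    t = np.deg2rad(theta_deg)
    return 0.5 * (s0 + s1 * np.cos(2 * t) + s2 * np.sin(2 * t))

def average_image(i0, i45, i90, i135):
    """Non-polarised image, approximating what a standard camera would see."""
    return 0.25 * (i0 + i45 + i90 + i135)
```

Setting the virtual angle to 0° or 90° reproduces the corresponding raw channel, which is a quick sanity check on the fit.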

According to Sony, pre-processing functions calculate polarisation-specific information such as the ‘degree of polarisation’ and the ‘surface normal vector’.
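Again without access to the SDK itself, the commonly used definitions of these quantities can be sketched from the same four channels. The snippet below (illustrative, not Sony’s implementation) computes the degree and angle of linear polarisation, the building blocks from which a surface normal is typically estimated:

```python
import numpy as np

def degree_of_linear_polarisation(i0, i45, i90, i135):
    """DoLP in [0, 1]: 0 = unpolarised light, 1 = fully linearly polarised."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)
    s1 = i0 - i90
    s2 = i45 - i135
    return np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-12)

def angle_of_polarisation(i0, i45, i90, i135):
    """AoLP in radians, the dominant polarisation orientation per pixel."""
    s1 = i0 - i90
    s2 = i45 - i135
    return 0.5 * np.arctan2(s2, s1)
```

Recovering an actual surface normal additionally requires a reflection model (e.g. Fresnel equations for specular or diffuse reflection), which is where a vendor-optimised library saves the development time Clauss describes.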
