
Estimation of solar resource for vehicle-integrated photovoltaics in urban environments using image-based shade detection


Abstract

Vehicle-Integrated Photovoltaics (VIPV) in urban environments face challenges in accurately estimating solar resource due to dynamic shading effects. This research presents a methodology for evaluating VIPV solar resource by analyzing imagery and detecting shade conditions along driving routes. Street image mapping services and obstacle detection algorithms are utilized to determine the shaded or sunny condition of the vehicle at each point. The approach enables the calculation of solar irradiance, considering direct and diffuse components, and identifies energetically optimal driving routes. The methodology provides valuable insights for optimizing MPPT algorithms and assessing VIPV performance in urban settings. It offers a practical tool for sustainable mobility and renewable energy integration.

© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

European mobility statistics show that urban trips (trips of less than 100 km within the same urban area) represent a substantial proportion of daily short-distance mobility. On an average day (working or not), people travel up to 19 km and do so mostly (60-70%) by car as a driver or passenger [1]. The growing fleet of electric vehicles makes the integration of photovoltaics into vehicle surfaces (VIPV) relevant for the future of mobility [2], as the added solar range of the vehicle could cover a large part of mobility needs, depending on the model [3].

For the proper development of VIPV technology, certain advancements in the modeling and characterization techniques of these systems are required [4], considering characteristics such as surface curvature, partial and dynamic shading, and movement. In this regard, we have previously proposed a tool that provides insight into the solar resource on the curved surface of the car roof [5]. The present work addresses another relevant challenge: the evaluation of the solar resource for VIPV in an urban environment.

Previous approaches that estimate shading effects on PV systems, as in BIPV, are not valid for VIPV because the system is in motion, so a dynamic, step-by-step methodology is necessary that considers the solar position and the surrounding obstacles at every moment. A diverse collection of obstacles casts a complex shadow over the roof of a parked vehicle, one that changes slowly throughout the day. The problem becomes truly complex when the vehicle moves along a route: the shadow then changes not only throughout the day but also at the speed of movement, caused by the obstacles at every step of the trajectory. Therefore, the factors that affect the irradiation seen by a VIPV surface are the specific location, the obstacle-free irradiance (a function of date and time), the vehicle orientation, and the obstacles that block the sunpath at every step of the route.

There are different approaches to evaluating shadow profiles on the vehicle when modeling the solar resource in VIPV. For example, using LIDAR data to reconstruct the 3D surrounding environment can be an accurate way to deal with shadows for a vehicle in movement [6]. Nevertheless, LIDAR data are not always available, and the computational effort implies processing times greater than those required for real-time applications. Another way to evaluate shadows over the vehicle is the processing of nearby imagery. For this purpose, it is very convenient to have a collection of photographs of the surroundings of the vehicle along the driving route [7,8].

Including shadow effects when modelling VIPV can provide valuable insights to address some current open questions. This includes examining the types of shadows that are commonly produced, not only considering the irradiance pattern over the vehicle but also the frequency of shading. This can help to study the optimal MPPT algorithms for VIPV. Additionally, analyzing the solar resource available for a specific route can shed light on the actual potential of VIPV in urban environments.

In this study, we propose a methodology for determining the sunny/shaded condition of the vehicle at each point of the trajectory from photographs. Considering the orientation and the solar geometry, the sunpath can be projected over the resulting photograph and the shading losses estimated. For simplicity, the photovoltaic system is treated as a point system, so the shading resolution at each instant is on/off; however, irradiance patterns over the vehicle could also be gathered, allowing partial shading analysis as well.

This paper is structured as follows. In “Methodology”, a procedure is presented to determine whether the vehicle is shaded or not in an urban environment, based on photographs of the surroundings along a given route. In these images, the solar trajectory is projected together with the obstacles detected between the Sun and the vehicle. From this information, the solar irradiation during a trajectory is calculated considering equally spaced sampling points. The “Results” section first presents a validation of the solar path projection on photographs using the above procedure. In addition, to show a concrete application of the described methodology, an example of how to find the most energetic route for a journey in the city is presented.

2. Methodology

For the calculation of shadows in urban environments and their impact on the available solar resource in VIPV, a methodology based on images from mapping services is proposed. For a specific urban environment where the VIPV system is located, the position of the Sun (given by the azimuth and solar elevation angles) is projected onto the image of the surroundings for a specific moment. The shading condition of the vehicle is evaluated by checking whether any obstacle blocks the Sun in the image. The same operation can be done for different moments throughout the day, allowing the projection of the Sun's trajectory during a certain time interval onto the image. If an obstacle coincides with the position of the Sun, the direct and circumsolar diffuse components are not taken into account for the calculation of the available solar resource. In turn, the diffuse component is affected by a sky view factor (SVF) [9–11], which can also be obtained from the hemispheric projection of the images of the evaluated environment. In some specific conditions where the solar altitude, the building albedo and the distance from the vehicle favor reflections, the reflected irradiance may have some relevance. However, the weight of the reflected component in the cumulative irradiance over the total path is expected to be low and is therefore neglected in this study.

2.1 Procedure to obtain irradiance from images

The proposed procedure to obtain the irradiance over the VIPV system on an urban driving route consists of the following steps:

1. Set the energetically representative sampling frequency of the route to be evaluated. The appropriate frequency for a particular route may be determined by the maximum speed of the road and its traffic. A sampling frequency close to 1 image per second is considered convenient, a value that Wetzel et al. [12] identify as the threshold for most of the changes in VIPV irradiance. If a higher sampling rate is needed, intermediate points may be obtained by interpolating between photographs.
2. Obtain the street mapping image of each sampled point on the route, with the proper orientation and field of view of the camera (section 2.1.1), so that it contains the solar position for the instant under investigation. The camera should preferably be oriented toward the equator (south/north in the northern/southern hemisphere) to capture most of the sunpath.
3. Project the solar coordinates (azimuth and solar elevation) onto the image (section 2.1.1).
4. Extract the shadow profile (obstacles) of the surroundings from the image by processing with masks (section 2.1.2).
5. Calculate the SVF from the images that cover the upper hemisphere (section 2.1.3).
6. Calculate the irradiance on the vehicle, considering whether the Sun is blocked or not, by means of steps 3 and 4 (section 2.1.4).
7. Repeat steps 2 to 6 to cover all the points sampled along the whole route.
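The per-point loop (steps 2 to 6, repeated in step 7) can be sketched as follows. This is a hypothetical skeleton, not the authors' code: every helper passed in (`fetch_image`, `sun_pixel`, `is_sky`, `svf_of`, `components`) is a placeholder for the corresponding subsection of the methodology.

```python
def point_irradiance(b, d_circ, d_iso, sf, svf):
    """Step 6: instantaneous global irradiance on the vehicle (W/m2)."""
    return (b + d_circ) * sf + d_iso * svf

def sample_route(points, fetch_image, sun_pixel, is_sky, svf_of, components):
    """Run steps 2-6 for every sampled point (step 7). All helpers are
    placeholders for sections 2.1.1-2.1.4, not the authors' code."""
    irradiances = []
    for lat, lon, t in points:
        img = fetch_image(lat, lon)            # step 2: street image
        u, v = sun_pixel(img, lat, lon, t)     # step 3: Sun position in pixels
        sf = 1 if is_sky(img, u, v) else 0     # step 4: obstacle check
        svf = svf_of(lat, lon)                 # step 5: sky view factor
        b, d_circ, d_iso = components(lat, lon, t)
        irradiances.append(point_irradiance(b, d_circ, d_iso, sf, svf))
    return irradiances
```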

2.1.1 Sun projection on street image mapping services

Street image mapping services [13] can be an interesting option due to their massive coverage, with the added relevance that the images are taken from the point of view of the vehicle. Some examples of street image mapping services are Google Street View (GSV) [14] and Apple Look Around [15], both based on their own collections of images, and Mapillary [16] and KartaView [17], two crowdsourced services that rely on their users to obtain their images. Remarkably, several disciplines currently use these services for objectives such as the estimation of spatial data infrastructure [18], greenery [19], building height [20,21] or sky view factor [11,22,23].

Among these services, GSV is a ubiquitous public option, an image service that provides photographs of the streets of many cities around the world, covering more than 85 countries and 10 million miles to date [14]. GSV obtains images by placing cameras on the roof of a vehicle, usually a car, at a height of 2.5 m [24]. The GSV API expects the following parameters: field of view (FOV) (at a maximum of 120°), geographical coordinates of the location, pitch and heading of the camera, and image size (640 × 640 pixels maximum).

Figure 1 shows some examples of GSV images in which the pitch (p) and the heading (h) of the camera have been modified. To maximize the sky seen in the photograph and to discard obstacles below the horizon, GSV's pitch can be fixed at 45°. Different orientations can be selected using the heading parameter of the GSV API; therefore, we can obtain the images needed to cover the Sun's trajectory during a certain period. Note that the image obtained by GSV corresponds to a particular instant of the year, and the position of the Sun in the image, if present, is irrelevant, since the Sun must be located for the instant under investigation.
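As an illustration, a request to the Street View Static API for one sampling point can be assembled as below; the endpoint and parameter names follow Google's public API, while the coordinates and the placeholder key are examples only.

```python
from urllib.parse import urlencode

GSV_ENDPOINT = "https://maps.googleapis.com/maps/api/streetview"

def gsv_image_url(lat, lon, heading, pitch=45, fov=90,
                  size="640x640", key="YOUR_API_KEY"):
    """Build a Street View Static API request for one sampling point.

    pitch=45 maximizes the visible sky (section 2.1.1), and heading is
    swept to cover the sunpath. `key` is a placeholder for a real API key.
    """
    params = {
        "size": size,                     # max 640x640
        "location": f"{lat},{lon}",
        "heading": round(heading % 360),  # 0-359, measured from north
        "pitch": pitch,
        "fov": fov,                       # max 120 degrees
        "key": key,
    }
    return f"{GSV_ENDPOINT}?{urlencode(params)}"

# e.g. the parking lot of the Solar Energy Institute, facing south:
url = gsv_image_url(40.45374, -3.72687, heading=180)
```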

Fig. 1. Images taken from GSV at the parking lot of the Solar Energy Institute, Madrid (Spain), with FOV = 90°: (a) pointing south, heading = 180° and pitch = 0°; (b) pointing south, heading = 180° and pitch = 45°; (c) pointing east, heading = 90° and pitch = 45°; (d) pointing west, heading = −90° and pitch = 45°.

The position of the Sun relative to an observer on the Earth's surface is described by the azimuth and solar elevation angles (ψS, γS), which can be obtained from astronomical formulas. To project the Sun's path onto an image, the azimuth and solar elevation angles must be converted into pixel coordinates on the photograph (u, v). Additionally, the optical axis of the camera is defined by the angular coordinates (h, p), corresponding to the heading and pitch of the images in GSV. This point coincides with the center of the photograph and serves as a reference point. A formal approach to projecting objects in a photograph, such as locating the Sun's pixel coordinates (u, v) for a given instant, requires the calibration of the camera [25] to account for the sensor and optics performance.
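A minimal sketch of this angle-to-pixel conversion, assuming an ideal pinhole (rectilinear) camera with a horizontal FOV and no lens distortion, which a full calibration [25] would refine; the solar angles are assumed to be given, e.g. by pvlib [30]:

```python
import math

def sun_to_pixel(sun_az, sun_el, cam_heading, cam_pitch,
                 fov_deg, width, height):
    """Rectilinear (pinhole) projection of the solar coordinates onto the
    image plane. Angles in degrees; azimuth and heading measured from
    north. Returns (u, v) in pixels, or None when the Sun is behind the
    image plane. Ignores lens distortion (a simplifying assumption).
    """
    az, el = math.radians(sun_az), math.radians(sun_el)
    h, p = math.radians(cam_heading), math.radians(cam_pitch)
    # Unit vectors in east-north-up world coordinates
    s = (math.cos(el) * math.sin(az), math.cos(el) * math.cos(az), math.sin(el))
    fwd = (math.cos(p) * math.sin(h), math.cos(p) * math.cos(h), math.sin(p))
    right = (math.cos(h), -math.sin(h), 0.0)
    up = (-math.sin(h) * math.sin(p), -math.cos(h) * math.sin(p), math.cos(p))
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    zc = dot(s, fwd)
    if zc <= 0:
        return None                      # Sun outside the forward hemisphere
    f = (width / 2) / math.tan(math.radians(fov_deg) / 2)  # focal, pixels
    u = width / 2 + f * dot(s, right) / zc
    v = height / 2 - f * dot(s, up) / zc
    return u, v
```

When the Sun lies exactly on the optical axis it maps to the image centre, which provides a quick sanity check of the geometry.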

A GSV image contains certain paths of the Sun, which vary depending on the location and the moment of the year being considered. Additionally, whether an image covers more or fewer solar positions (elevation and azimuth) also depends on the FOV and the camera optical axis (pitch and heading), as shown in Fig. 2.

Fig. 2. (a) Schema of the image coordinates as seen by the camera (origin), the pointing vector of the camera (h = 180°, p = 45°), the solar coordinates (ψS = 127.2°, γS = 65.2°), and the corresponding image pixels (u = 190, v = 60; image size 640 × 480 pixels); (b) projected solar trajectory in the image, given by the solar coordinates translated to pixels, for June 15th at 12h50.

In Fig. 3, black lines represent the solar coordinates (sunpath) over one year. For a pitch of 0° and a south orientation (heading = 180°), only half of the view covers the upper hemisphere (blue area); changing to a pitch of 45°, more of the sunpath is covered and the view is maximized (red area); the rest of the sunpath, not seen when aiming south, is covered by modifying the heading to east and west (180° ± 90°). Lower solar elevations are not energetically relevant and in urban scenarios are almost always covered by the cityscape.

Fig. 3. Sunpath of Madrid (black lines) in cylindrical projection. In blue, the portion of view defined by a camera pointing as in Fig. 1(a), showing that only half of the view covers the upper hemisphere; in red, the view with pitch changed to 45° (Fig. 1(b)), covering more of the sunpath and maximizing the view; in green, the views with heading aimed east and west (Fig. 1(c) and (d)) to cover the rest of the sunpath not seen when aiming south. Lower solar elevations are not energetically relevant and in urban scenarios are almost always covered by the cityscape.

2.1.2 Shadow profile from image processing based on masks

To detect obstacles blocking direct sunlight, the sky must be distinguished from other objects, such as buildings or vegetation, in the image. There are basically two main approaches: threshold-based algorithms and machine learning-based algorithms [26]. While the former are simple to implement and require little computational load, those based on machine learning, although they may have a better response, require training on a large prior database [19,27,28].

To validate the concept, we have implemented a threshold-based algorithm with a mask using OpenCV [29], a computer vision library. A series of masks is applied to the GSV photographs for sky identification, based on hue search in various color models and on contour and edge detection, with subsequent smoothing to diminish poorly processed pixels. Given the geographic coordinates of any location, the evolution of the solar trajectory over a given day can be plotted over the images of that location, as seen in Fig. 4.

Fig. 4. Obstacle mask example (a) from a GSV image (b), differentiating the sky from the rest of the objects that block sunlight. Red dots show the sunpath for a given day at different instants of time, as seen from the camera's point of view.

At a given moment, the sunny or shaded condition at the site is determined by the intersection of the solar trajectory with buildings, urban furniture, or vegetation. For example, in Fig. 4, a vehicle parked at the position where the GSV image was taken is in the shade from dawn until around 12:25.
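A much-simplified stand-in for the mask step, assuming a plain blue-dominance threshold instead of the multi-model OpenCV pipeline described above (the thresholds are illustrative, not those of the paper):

```python
import numpy as np

def sky_mask(rgb):
    """Naive threshold-based sky mask: a pixel counts as sky when blue
    dominates. Illustrative thresholds; the paper's pipeline combines
    several color models, contour/edge detection and smoothing."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (b > 120) & (b > r + 20) & (b > g + 10)

def shading_factor(mask, u, v):
    """SF at one instant: 1 if the projected Sun pixel is sky, else 0."""
    return int(bool(mask[int(v), int(u)]))

# Synthetic scene: sky in the upper half, a facade in the lower half
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[:50] = (110, 160, 230)   # sky blue
img[50:] = (120, 110, 100)   # building facade
m = sky_mask(img)
```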

2.1.3 Estimation of the SVF

The SVF calculation requires the image of the location pointing to the sky (pitch = 90°) and, at most, four additional images at the cardinal points, with FOV = 90°. From the rectilinear images, a hemispherical equisolid-angle projection can be obtained, in which equal solid angles map to equal image areas over any viewing angle. The SVF is then obtained by counting the sky area out of the total, as shown in Fig. 5.

Fig. 5. (a) Representation of the location of Fig. 1(a) in a cubemap, an unfolded cube with 6 views: (p = 0, h = 0), (p = 0, h = 90), (p = 90, h = 0), (p = 0, h = 180), (p = 0, h = 270), (p = −90, h = 0; not required, shown for illustrative purposes only), and (b) the corresponding hemispherical transformation. The SVF in this case is 0.63.
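Once the hemispherical equisolid-angle image is available (its construction from the cubemap of Fig. 5 is assumed done), the SVF reduces to a pixel count, since this projection preserves solid angle. A sketch:

```python
import numpy as np

def svf_equisolid(sky, cx=None, cy=None, radius=None):
    """SVF from a boolean sky mask in equisolid-angle hemispherical
    projection: because this projection preserves solid angle, the SVF is
    simply the sky-pixel fraction inside the image circle."""
    h, w = sky.shape
    cx = w / 2 if cx is None else cx
    cy = h / 2 if cy is None else cy
    radius = min(h, w) / 2 if radius is None else radius
    yy, xx = np.mgrid[0:h, 0:w]
    disk = (xx - cx) ** 2 + (yy - cy) ** 2 <= radius ** 2  # hemisphere
    return float(sky[disk].sum() / disk.sum())
```

For a fully open sky the function returns 1.0; masking half of the hemisphere drives it toward 0.5, which matches the solid-angle interpretation of the SVF.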

2.1.4 Modeled irradiance on a driving route

The calculation of solar irradiance in the plane of the vehicle requires estimating the global irradiance (G) at each position and time of the trajectory. To do this, the direct (B) and diffuse (D) components must be estimated, the latter in turn consisting of circumsolar and isotropic irradiance. In addition, to account for the effect of shading on the vehicle, an estimated shading factor (SF) has to be considered. The SF is defined as a Boolean value that is zero in case of shading and one if sunlight is not blocked at the moment considered.

The corresponding SF is applied to the direct and circumsolar diffuse components at all intermediate points along the route to calculate the solar irradiance on the vehicle (Groute_s). In turn, the isotropic diffuse component arriving from the sky vault is partially blocked by the obstacle profile; the sky view factor, SVF, accounts for the portion of the visible hemisphere, discounting the part hidden by obstacles.

$$G_{route\_s} = \int_{origin}^{destination} \left[ \left( B(t) + D_{circumsolar}(t) \right) SF(t) + D_{isotropic}(t)\, SVF(t) \right] dt \;\; [\mathrm{Wh\;m^{-2}}]$$

One way to estimate the degree of shading of a given route is to consider the ratio of its irradiation to that of the same route without taking into account the urban obstacles (SF = 1 at all moments of the route), Groute_ns. Thus, we can define the dynamic urban shading factor (DUSFroute) of a route as

$$DUSF_{route} = \frac{G_{route\_s}}{G_{route\_ns}}\;[\%]$$
The DUSFroute allows estimating the efficiency of a route in terms of solar energy collection. In addition, if two routes have similar arrival times, it allows comparing which is more convenient for electricity generation.
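In discrete form, over equally spaced time samples, Eqs. (1) and (2) reduce to sums. A sketch, assuming each sample carries its irradiance components, SF and SVF (following the text, the unshaded reference only forces SF = 1, keeping the same SVF):

```python
def route_irradiation_and_dusf(samples, dt_h):
    """Discrete version of Eqs. (1) and (2) over equally spaced samples.
    Each sample is (B, D_circ, D_iso, SF, SVF) in W/m2, with SF in {0, 1};
    dt_h is the sampling interval in hours.
    Returns (G_route_s in Wh/m2, DUSF_route as a fraction)."""
    g_s = sum((b + dc) * sf + di * svf for b, dc, di, sf, svf in samples) * dt_h
    # Unshaded reference: SF = 1 at every moment, same SVF (per the text)
    g_ns = sum((b + dc) + di * svf for b, dc, di, sf, svf in samples) * dt_h
    return g_s, g_s / g_ns
```

For example, two one-second samples where the Sun is blocked half the time (SF = 1 then 0) yield a DUSF well below 1, while forcing SF = 1 everywhere recovers DUSF = 1 exactly.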

3. Results

3.1 Validation of sunpath projection on photographs

To validate the proposed procedure, the actual Sun position has been compared with the sunpath obtained from the processing of a GSV image. To record the actual Sun position, a set of photographs was taken at time intervals during one particular day at a specific site, the parking lot of the Solar Energy Institute in Madrid (lat = 40.45374°, long = −3.72687°). Subsequently, the sunpath was projected onto the images as described in the previous section, to check whether the estimated Sun position coincides with the one in the image. As the proposed procedure is expected to be used with GSV, the photographs reproduce the same scenery as the GSV image for the same geographical location. To do this, the images were taken with the camera at the same height (2.5 m) as GSV and with the largest FOV available on the camera (96°) to cover the maximum scene in the image. To adjust the elevation (pitch) and orientation (heading) of the camera, a side-by-side comparison was performed, as shown in Fig. 6, setting the same FOV in GSV.

Fig. 6. (a) Image from the GSV service at the experimental point; (b) image taken from the same point with a camera at the same height (2.5 m), heading (180°) and pitch (0°), so that the same objects appear at the same pixel coordinates. The FOV of the camera (96°) was obtained beforehand and then set in the GSV API.

With the camera static, photographs were taken at different times to capture the sunpath, in this case on February 23rd. The images are overlapped to show the sunpath together with the path estimation (red dots) obtained following the procedure described above, as presented in Fig. 7, with a maximum deviation between the Sun's position and the projected sunpath of 3°. The experiment was repeated with images acquired at different orientations, tilts and FOVs of the camera, with the same results as presented in Fig. 7.

Fig. 7. (a) Validation of the procedure to translate angles to image pixels, using the same image parameters as Fig. 6(b), at several moments to capture the actual sunpath; the calculated solar trajectory is overlaid with red dots. (b) Pitch increased to 41°, showing that the procedure works at different heading and pitch angles.

Using the FOV together with the orientation and pitch of the GSV camera, it is possible to determine which image contains the solar coordinates to be evaluated. Figure 8(a) shows the Sun's trajectories on four different days (February 23rd, the solstices, and the equinoxes) and the positions of the Sun at four different times on February 23rd. The complete sunpath is represented in Fig. 8(b), together with the portion of view of the camera with the same pointing angles as introduced in Fig. 6(b).

Fig. 8. (a) Photograph taken on February 23rd from the same GSV point analyzed in Fig. 6, but changing both pitch (25°) and heading (215°), overlaying the sunpath of the day and those of the solstices and equinoxes. (b) The same sunpaths in cylindrical projection, plus the portion of view defined by the camera at those angles.

If it is necessary to cover the solar path of the whole year, the shadow profile of a location can be obtained by sweeping through several images that jointly include the solar path from sunrise to sunset throughout the year, as shown in Fig. 8(b), where the annual sunpaths are plotted in cylindrical projection for the point of study, plus the portions of view defined by the camera when sweeping the heading (90°, 180° and 270° ≡ −90°) with the pitch fixed at 45°. For elevations higher than 10°, the annual sunpath is then covered by only three images with FOV = 90°.

3.2 Application example: most energetic route calculation

To obtain the routes, the Google Maps (GM) service has been used through its API. In this way, a list of points between origin and destination is obtained, with the peculiarity that the points given by GM are indications of where to change direction, so they are not equally spaced. However, in order to calculate the integrated irradiance during the trajectory according to Eq. (1), the segments of the trajectory are interpolated so that points every 10 m are considered for estimating the irradiance. The exact position within the road in this experiment is fixed by where the GSV service took the closest photograph, taking into account that on average these photographs are also spaced every 10 m.
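A minimal sketch of this resampling, using an equirectangular distance approximation (adequate at city scale) and straight-line interpolation between consecutive GM points; not the authors' implementation:

```python
import math

def interpolate_route(points, step_m=10.0):
    """Resample a GM polyline (turn-by-turn points, unevenly spaced) into
    points every step_m metres. points: list of (lat, lon) in degrees."""
    R = 6371000.0  # mean Earth radius, m

    def dist(p, q):
        # Equirectangular approximation, fine at city scale
        lat1, lon1 = map(math.radians, p)
        lat2, lon2 = map(math.radians, q)
        x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
        y = lat2 - lat1
        return R * math.hypot(x, y)

    out = [points[0]]
    carry = 0.0  # distance already walked past the last emitted point
    for p, q in zip(points, points[1:]):
        d = dist(p, q)
        if d == 0:
            continue
        s = step_m - carry
        while s <= d:
            f = s / d
            out.append((p[0] + f * (q[0] - p[0]), p[1] + f * (q[1] - p[1])))
            s += step_m
        carry = d - (s - step_m)
    return out
```

Each resampled point is then matched to the nearest available GSV photograph, which is itself spaced roughly every 10 m.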

In this example, some simplifications are made when implementing the proposed methodology. First, a constant speed is assumed, although a more realistic approach would include the real velocity, considering stops (mainly due to traffic lights) and the exact position of the vehicle at each instant. Second, to model the irradiance, the SVF has been fixed at 0.50 for the whole route, a typical value for urban environments [22].

As an example and possible use case of the estimation of the irradiance of a route, consider the scenario in which different alternative trajectories are available to go from the same origin to the same destination in the city of Madrid, as shown in Fig. 9. Specifically, two trajectories (blue and red) are shown, on May 13th at 11:50 am, with identical origin and destination; they have been chosen among the various options because both last 16 minutes, even though they differ in length and waiting times. Equal travel time matters because the irradiations are to be compared.


Fig. 9. Urban route between two points in Madrid with two possible trajectories (blue and red). These trajectories have been selected because they have the same estimated time and are therefore energetically comparable, although with different DUSF. Source: Google Maps


Taking into account the metric defined in Eq. (2) (the dynamic urban shading factor, DUSFroute), the blue route's DUSFroute is 51% while the red route's is 65%, meaning that the red option, while taking the same time, yields more irradiation than the blue one.

It takes 10 to 15 seconds to process each image (corresponding to a 10-meter run) on a 2020 office desktop PC, where the mask calculation is the bottleneck. This is a very conservative figure, based on a proof of concept developed in Python that has not been optimized for speed.

4. Conclusions

The estimation of the solar resource is fundamental for a PV system and becomes especially laborious for VIPV systems, as they are mobile. In an urban environment it is even more relevant because 1) a large number of obstacles cause shading and 2) the shading profile changes as the vehicle moves. Therefore, calculating the irradiance on an urban route requires establishing, for a given place and time, whether the surroundings block solar radiation. This is where street images can offer great support, due to the large coverage of cities they provide. Based on street image mapping services, the proposed method places the solar path of any day of the year on a GSV image while detecting the profile of obstacles, so that, jointly, it can be known at each location and instant whether the Sun is blocked.

A new procedure has been established to obtain the irradiation during an urban route. For this purpose, masks of the sky in GSV images are obtained to determine when the Sun is blocked at a given point and time. Thanks to this, a uniformly interpolated route can be obtained and the irradiation along it determined. When insolation is considered for urban VIPV routes, it can influence the route chosen, as some routes may provide more energy for the vehicle than others. Solar irradiation could then be added to relevant parameters such as traffic, arrival time or journey length when planning an urban route.

The whole procedure is also valid for estimating routes in advance using image sources other than GSV-type services, such as one's own measurements, provided that the images are geolocated and the FOV of the camera used is known. Some tuning of the mask algorithms is to be expected due to different levels of brightness and hue.

This study presents a proof-of-concept procedure; consequently, there are some points that could improve the accuracy of the method and others that require further study:

1. The sampling frequency along a route, as discussed in the Methodology, is in fact an open question in the VIPV community. Considering the limitation of one image every 10 m in the GSV service and an urban speed of 50 km/h (13.9 m/s), the sampling frequency is close to 1 image per second.
2. The complete calculation of the isotropic diffuse irradiance requires that the SVF also be estimated from the masks, in this case applied over the corresponding set of images spanning 360°.
3. The calculation of masks for obstacle detection is not trivial, as it generates both false positives and false negatives and can take considerable time depending on the method applied. The use of machine learning, specifically neural networks as in other similar studies, should improve this process.

Funding

Ministerio de Ciencia e Innovación (MCIN/AEI/ 10.13039/501100011033, PID2021-128853OB-I00).

Acknowledgments

This work has been supported by the project DETEC-PV, Grant PID2021-128853OB-I00 funded by MCIN/AEI/ 10.13039/501100011033 and, by “ERDF A way of making Europe”. We acknowledge open-source libraries pvlib-python [30] and OpenCV [29].

Disclosures

The authors declare no conflicts of interest.

Data availability

Code and data underlying the results presented in this paper are available in Code 1 Ref. [31].

References

1. “Eurostat - Passenger mobility statistics,” https://ec.europa.eu/eurostat/statistics-explained/index.php?title=Passenger_mobility_statistics.

2. O. Kanz and B. Lim, Vehicle-Integrated Photovoltaics (VIPV) as a Core Source for Electricity in Road Transport, (European Technology and Innovation Platform for Photovoltaics (ETIP), 2020).

3. M. Heinrich, “Potential and Challenges of vehicle integrated photovoltaics for passenger cars,” (2020).

4. K. Araki, L. Ji, G. Kelly, and M. Yamaguchi, “To Do List for Research and Development and International Standardization to Achieve the Goal of Running a Majority of Electric Vehicles on Solar Energy,” Coatings 8(7), 251 (2018). [CrossRef]  

5. J. Macias, R. Herrero, I. Anton, and R. Nuñez, “Evaluation of the Solar Resource and Energy Generation in Vehicle Integrated Photovoltaics,” in 37th European Photovoltaic Solar Energy Conference and Exhibition (WIP, 2020), pp. 1927–1930.

6. M. Centeno Brito, T. Santos, F. Moura, D. Pera, and J. Rocha, “Urban solar potential for vehicle integrated photovoltaics,” Transportation Research Part D: Transport and Environment 94, 102810 (2021). [CrossRef]  

7. K. Araki, Y. Ota, A. Nagaoka, and K. Nishioka, “3D Solar Irradiance Model for Non-Uniform Shading Environments Using Shading (Aperture) Matrix Enhanced by Local Coordinate System,” Energies 16(11), 4414 (2023). [CrossRef]  

8. K. Araki, Y. Ota, A. Maeda, M. Kumano, and K. Nishioka, “Solar Electric Vehicles as Energy Sources in Disaster Zones: Physical and Social Factors,” Energies 16(8), 3580 (2023). [CrossRef]  

9. D. G. Steyn, “The calculation of view factors from fisheye-lens photographs: Research note,” Atmos.-Ocean 18(3), 254–258 (1980). [CrossRef]  

10. I. García, M. de Blas, B. Hernández, C. Sáenz, and J. L. Torres, “Diffuse irradiance on tilted planes in urban environments: Evaluation of models modified with sky and circumsolar view factors,” Renewable Energy 180, 1194–1209 (2021). [CrossRef]  

11. J. Bernard, E. Bocher, G. Petit, and S. Palominos, “Sky View Factor Calculation in Urban Context: Computational Performance and Accuracy Analysis of Two Open and Free GIS Tools,” Climate 6(3), 60 (2018). [CrossRef]  

12. G. Wetzel, L. Salomon, J. Krügener, D. Bredemeier, and R. Peibst, “High time resolution measurement of solar irradiance onto driving car body for vehicle integrated photovoltaics,” Progress in Photovoltaics: Research and Applications n/a, (2021).

13. F. Biljecki and K. Ito, “Street view imagery in urban analytics and GIS: A review,” Landscape and Urban Planning 215, 104217 (2021). [CrossRef]  

14. “Google Street View,” https://www.google.com/streetview/.

15. “Look Around (Apple) - Wikipedia,” https://en.wikipedia.org/wiki/Look_Around_(Apple).

16. “Mapillary,” https://www.mapillary.com/.

17. “KartaView,” https://kartaview.org/landing.

18. Y. Kang, F. Zhang, S. Gao, H. Lin, and Y. Liu, “A review of urban physical environment sensing using street view imagery in public health studies,” Annals of GIS 26(3), 261–275 (2020). [CrossRef]  

19. X. Li, C. Zhang, W. Li, R. Ricard, Q. Meng, and W. Zhang, “Assessing street-level urban greenery using Google Street View and a modified green view index,” Urban Forestry & Urban Greening 14(3), 675–685 (2015). [CrossRef]  

20. E. Díaz and H. Arguello, “An algorithm to estimate building heights from Google street-view imagery using single view metrology across a representational state transfer system,” in Dimensional Optical Metrology and Inspection for Practical Applications V (SPIE, 2016), Vol. 9868, pp. 8–5.

21. S. C. Government of Canada, “An open-source system for building-height estimation using street-view images, deep learning, and building footprints,” https://www150.statcan.gc.ca/n1/pub/18-001-x/18-001-x2020002-eng.htm.

22. J. Liang, J. Gong, J. Sun, J. Zhou, W. Li, Y. Li, J. Liu, and S. Shen, “Automatic Sky View Factor Estimation from Street View Photographs—A Big Data Approach,” Remote Sens. 9(5), 411 (2017). [CrossRef]  

23. A. Matzarakis and O. Matuschek, “Sky view factor as a parameter in applied climatology rapid estimation by the SkyHelios model,” Meteorologische Zeitschrift 20(1), 39–45 (2011). [CrossRef]  

24. D. Anguelov, C. Dulong, D. Filip, C. Frueh, S. Lafon, R. Lyon, A. Ogale, L. Vincent, and J. Weaver, “Google Street View: Capturing the World at Street Level,” Computer 43(6), 32–38 (2010). [CrossRef]  

25. Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Machine Intell. 22(11), 1330–1334 (2000). [CrossRef]  

26. Y. Karout, S. Thil, J. Eynard, and S. Grieu, “A supervised deep learning model for cloud/sky image segmentation,” 11 (2020).

27. C.-W. Wang, J.-J. Ding, and P.-J. Chen, “An efficient sky detection algorithm based on hybrid probability model,” in 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA) (IEEE, 2015), pp. 919–922.

28. K. A. Nice, J. S. Wijnands, A. Middel, J. Wang, Y. Qiu, N. Zhao, J. Thompson, G. D. P. A. Aschwanden, H. Zhao, and M. Stevenson, “Sky pixel detection in outdoor imagery using an adaptive algorithm and machine learning,” Urban Climate 31, 100572 (2020). [CrossRef]  

29. I. Culjak, D. Abram, T. Pribanic, H. Dzapo, and M. Cifrek, “A brief introduction to OpenCV,” in 2012 Proceedings of the 35th International Convention MIPRO (2012), pp. 1725–1730.

30. W. F. Holmgren, C. W. Hansen, and M. A. Mikofski, “pvlib python: a python package for modeling solar energy systems,” Journal of Open Source Software 3(29), 884 (2018). [CrossRef]  

31. R. Nunez and A. Muñoz, “Estimation of Solar Resource for Vehicle-Integrated Photovoltaics in Urban Environments Using Image-Based Shade Detection,” Github (2023) https://github.com/isi-ies-group/estimation_solar_VIPV.

Supplementary Material (1)

Code 1: Code repository

Data availability

Code and data underlying the results presented in this paper are available in Code 1, Ref. [31].


Figures (9)

Fig. 1. Images taken from GSV at the parking lot of the Solar Energy Institute, Madrid (Spain) with FOV = 90°. (a) Pointing to the south, heading = 180° and pitch = 0°; (b) pointing to the south, heading = 180° and pitch = 45°; (c) pointing to the east, heading = 90° and pitch = 45°; (d) pointing to the west, heading = -90° and pitch = 45°.
Fig. 2. (a) Schema of the image coordinates as seen by the camera (origin), the pointing vector of the camera (h = 180°, p = 45°), the solar coordinates (ψS = 127.2°, γS = 65.2°), and the image pixels (u = 190, v = 60; image size 640 × 480 pixels); (b) projected solar trajectory in the image, given by the solar coordinates translated to pixels, for June 15th at 12:50.
Fig. 3. Sunpath of Madrid (black lines) in cylindrical projection. In blue, the portion of view defined by a camera pointing as in Fig. 1(a), showing that only half of the view covers the upper hemisphere; in red, the pitch is changed to 45° (Fig. 1(b)) so that more of the sunpath is covered and the view is maximized; in green, the heading aims east and west (Fig. 1(c) and (d)) to cover the rest of the sunpath not seen when aiming south. Lower solar elevations are not energetically relevant and, in urban scenarios, are almost always blocked by the cityscape.
Fig. 4. Obstacle mask example (a) obtained from a GSV image (b), differentiating the sky from the objects that block sunlight. Red dots show the sunpath for a given day, at different instants of time, as seen from the camera’s point of view.
Fig. 5. (a) Representation of the location of Fig. 1(a) in a cubemap, an unfolded cube with 6 views: (p = 0, h = 0), (p = 0, h = 90), (p = 90, h = 0), (p = 0, h = 180), (p = 0, h = 270), and (p = −90, h = 0; not required, shown for illustrative purposes only); and (b) the corresponding hemispherical transformation. The SVF in this case is 0.63.
Fig. 6. (a) Image from the GSV service at the experimental point; (b) image taken from the same point with a camera at the same height (2.5 m), heading (180°) and pitch (0°), resulting in the same objects at the same pixel coordinates. The FOV of the camera (96°) was obtained beforehand and then set in the GSV API.
Fig. 7. (a) Proof of the procedure to translate angles to image pixels, using the same image parameters as in Fig. 6(b), captured at several moments to record the actual sunpath while the calculated solar trajectory is overlaid with red dots. (b) Pitch increased to 41°, showing that the procedure works at different heading and pitch angles.
Fig. 8. (a) Photograph taken on February 23 from the same GSV point analyzed in Fig. 6, but changing both pitch (25°) and heading (215°), with the sunpaths of that day and of the solstices and equinox overlaid. (b) The same sunpaths in cylindrical projection, plus the portion of view defined by the camera at those angles.
Fig. 9. Urban route between two points in Madrid with two possible trajectories (blue and red). These trajectories have been selected because they have the same estimated time and are therefore energetically comparable, although with different DUSF. Source: Google Maps.
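The translation of solar angles into image pixels described in the Fig. 2 caption (and validated in Figs. 7 and 8) can be sketched with a standard pinhole-camera model. The following is a minimal illustration, not the authors' code: the function name and the frame conventions (azimuth measured clockwise from north, camera defined by heading, pitch and horizontal FOV) are our assumptions.

```python
import numpy as np

def sun_to_pixel(sun_az, sun_el, heading, pitch, fov_deg, width, height):
    """Project solar coordinates (azimuth, elevation, degrees; azimuth
    clockwise from north) onto pixel coordinates (u, v) of a pinhole
    camera defined by heading, pitch and horizontal FOV.
    Returns None when the Sun lies behind the image plane."""
    az, el = np.radians(sun_az), np.radians(sun_el)
    h, p = np.radians(heading), np.radians(pitch)
    # Unit vector towards the Sun, world frame: x east, y north, z up
    s = np.array([np.cos(el) * np.sin(az),
                  np.cos(el) * np.cos(az),
                  np.sin(el)])
    cy, sy = np.cos(h), np.sin(h)
    cp, sp = np.cos(p), np.sin(p)
    # Components of s in the camera frame (right, forward, up)
    right = s[0] * cy - s[1] * sy
    fwd = (s[0] * sy + s[1] * cy) * cp + s[2] * sp
    up = -(s[0] * sy + s[1] * cy) * sp + s[2] * cp
    if fwd <= 0:
        return None  # Sun behind the camera
    f = (width / 2) / np.tan(np.radians(fov_deg) / 2)  # focal length, pixels
    u = width / 2 + f * right / fwd
    v = height / 2 - f * up / fwd
    return u, v

# Fig. 2 example: camera h = 180°, p = 45°, FOV = 90°, 640x480 image,
# Sun at psi_S = 127.2°, gamma_S = 65.2° -> approximately pixel (190, 60)
print(sun_to_pixel(127.2, 65.2, 180, 45, 90, 640, 480))
```

With the Fig. 2 parameters this sketch reproduces the caption's pixel coordinates (u = 190, v = 60) to within a pixel, which suggests the paper's projection follows this standard model.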

Equations (2)


$$G_{route\_s} = \int_{origin}^{destination} \left[ \left( B(t) + D_{circumsolar}(t) \right) SF(t) + D_{isotropic}(t)\, SVF(t) \right] dt \quad \left[ \mathrm{W\,m^{-2}\,h} \right]$$
$$DUSF_{route} = \frac{G_{route\_s}}{G_{route\_ns}} \quad [\%]$$
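In practice the route integral is evaluated as a time-discretized sum over the sampled route points, and the DUSF is the ratio of the shaded-route total to the same route evaluated without shading. A minimal sketch, assuming a hypothetical per-point sample structure with the irradiance components in W/m²:

```python
def route_irradiance(samples, dt_hours):
    """Riemann-sum approximation of the route integral.
    Each sample holds beam irradiance B, circumsolar diffuse Dc,
    isotropic diffuse Di (all W/m^2), the shading factor SF
    (1 = sunlit, 0 = shaded) and the sky-view factor SVF at one
    route point. Returns route irradiation in W*h/m^2."""
    return sum(((s["B"] + s["Dc"]) * s["SF"] + s["Di"] * s["SVF"]) * dt_hours
               for s in samples)

def dusf(g_route_shaded, g_route_unshaded):
    """Dynamic Urban Shading Factor (DUSF), in percent: shaded-route
    irradiation relative to the same route without shading."""
    return 100.0 * g_route_shaded / g_route_unshaded
```

Setting SF = 1 and SVF = 1 for every sample yields the unshaded reference G_route_ns used in the denominator of the second equation.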