Install from repository:
pip install -e "git+https://github.com/Denikozub/Lawn_paths.git#egg=lawn_paths"
Warning: the package requires GeoPandas, which can be problematic to install on Windows. This article may help.
docker run -d -it --name final --mount type=bind,source="$(pwd)"/target,target=/app kadmus_map
target/ - directory containing the TIF and TFW files
Docker link
Documentation
The program builds a shapefile from the neural network's output: NPY image masks and their corresponding TFW world files.
The results can be filtered, the final lines smoothed, and the coordinate reference system chosen.
Filtering:
- By path width
- By distance between trails
- By bounding box size
- By area
Additionally, intersections with buildings in Moscow can be checked efficiently (6 seconds on test data) to reduce the likelihood of errors. The building data was downloaded from OpenStreetMap, then processed and compressed.
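The repository's actual intersection-check implementation is not shown here; a minimal pure-Python sketch of the usual speed-up idea (cheap bounding-box prefiltering before any exact geometry test) might look like this. All function names are illustrative, not the package's API:

```python
# Hypothetical sketch: reject most path/building pairs cheaply by comparing
# axis-aligned bounding boxes before running an expensive exact intersection
# test. The real package likely uses GeoPandas' spatial index instead.

def bbox(points):
    """Axis-aligned bounding box (minx, miny, maxx, maxy) of a point list."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def bboxes_overlap(a, b):
    """True if two bounding boxes intersect."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def candidate_buildings(path_points, buildings):
    """Indices of buildings whose bbox overlaps the path's bbox."""
    pb = bbox(path_points)
    return [i for i, bld in enumerate(buildings) if bboxes_overlap(pb, bbox(bld))]

path = [(0, 0), (2, 1), (4, 0)]
buildings = [
    [(1, 0), (2, 0), (2, 1)],        # near the path: kept for exact testing
    [(10, 10), (11, 10), (11, 11)],  # far away: rejected cheaply
]
print(candidate_buildings(path, buildings))  # [0]
```

Only the surviving candidates would then go through an exact polygon intersection test, which is where most of the time is saved.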
Documentation
Found trails can be plotted on an interactive OpenStreetMap map using the open-source Leaflet library. This solution does not require the paid APIs of services such as Google Maps or Yandex Maps and can be used in commercial projects.
Documentation
An iterative approach to data labeling using deep learning has been developed and implemented:
- Label a portion of the data manually
- Train the neural network on the labeled data
- Apply the trained network to pre-label the next portion of data
- Go to step 1
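The loop above can be sketched as follows. All function names here are placeholders for the real training, inference, and manual-correction steps, not the repository's API:

```python
# Illustrative skeleton of the iterative labeling loop described above.
# train(), predict() and correct_manually() are placeholders.

def iterative_labeling(chunks, train, predict, correct_manually):
    """Label data chunk by chunk, bootstrapping each round with the model."""
    labeled = []
    for chunk in chunks:
        if not labeled:
            # Step 1: the first chunk is labeled fully by hand.
            labels = correct_manually(chunk, draft=None)
        else:
            # Steps 2-3: train on everything labeled so far, then use the
            # model's predictions as a draft for manual correction.
            model = train(labeled)
            labels = correct_manually(chunk, draft=predict(model, chunk))
        labeled.append((chunk, labels))
    return labeled

# Toy run: "labels" are just uppercase versions of the data.
result = iterative_labeling(
    chunks=["ab", "cd"],
    train=lambda data: "model",
    predict=lambda model, chunk: chunk.upper(),
    correct_manually=lambda chunk, draft: draft or chunk.upper(),
)
print(result)  # [('ab', 'AB'), ('cd', 'CD')]
```

Each round, the model's drafts cut down the amount of manual work, which is the point of the iterative scheme.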
python visual_build/main.py
Run from the command line:
python preliminary_markup/pipeline.py get_mask img.tif
Returns a mask in which the areas found by the neural network are highlighted in red on a white background. The mask is saved to img_mask.tif.
It remains to erase the extra red marks manually, then convert the mask to .npy with:
python preliminary_markup/pipeline.py get_npy img_mask.tif
Red is replaced by white; everything else becomes black.
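A minimal NumPy sketch of that conversion, assuming an RGB uint8 image array (thresholds here are illustrative; the actual pipeline may use different values):

```python
import numpy as np

# Hypothetical sketch of the red-to-binary conversion described above:
# (mostly) red pixels become white (255), everything else black (0).

def red_to_binary(rgb):
    """rgb: HxWx3 uint8 array -> HxW uint8 binary mask."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    is_red = (r > 200) & (g < 100) & (b < 100)
    return np.where(is_red, 255, 0).astype(np.uint8)

img = np.zeros((2, 2, 3), dtype=np.uint8)
img[0, 0] = (255, 0, 0)      # red mark -> white in the mask
img[1, 1] = (255, 255, 255)  # background -> black in the mask
print(red_to_binary(img))
# [[255   0]
#  [  0   0]]
```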
Command to view the result of overlaying the mask on the image:
python preliminary_markup/pipeline.py blend image.tif mask.tif
Before each subsequent stage of neural network training, the pipeline must be updated.
from lawn_paths.map_builder.shapefile import build_shapefile
build_shapefile(dataset_directory: str,
file_list: list = None,
buildings_file: str = "Moscow_Buildings.zip",
output_filename: str = 'paths.shp',
crs: str = 'epsg:32637',
max_path_distance_cm: float = 100.,
max_path_width_cm: float = 80.,
min_bbox_size_m: float = 1.,
max_bbox_size_m: float = 200.,
max_path_area_m2: float = 100.,
p_epsilon: float = 0.3,
c_epsilon: float = 2.)
dataset_directory: directory containing the .NPY mask files and .TFW world files
file_list: list of filenames to be processed (without extensions)
buildings_file: file with buildings (polygons) in epsg:4326
output_filename: name of the output file (should be .SHP)
crs: initial coordinate reference system
max_path_distance_cm: max distance between paths for them to be connected, in cm
max_path_width_cm: max path width in cm
min_bbox_size_m: min size of a path's bounding box in meters
max_bbox_size_m: max size of a path's bounding box in meters
max_path_area_m2: max path area in square meters
p_epsilon: RDP parameter for smoothing path polygons
c_epsilon: RDP parameter for smoothing path polygon centerlines
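A typical call, keeping the defaults except for a custom file list, might look like the following. The directory and tile names are illustrative, and running the call itself requires the package and GeoPandas to be installed, so it is shown commented out:

```python
# Illustrative parameters; directory and tile names are hypothetical.
params = dict(
    dataset_directory="target/",       # .NPY masks + .TFW world files
    file_list=["tile_01", "tile_02"],  # extensions are omitted
    output_filename="paths.shp",
    crs="epsg:32637",
    max_path_width_cm=80.0,            # keep only trail-width paths
)

# Requires lawn_paths (and GeoPandas) to be installed:
# from lawn_paths.map_builder.shapefile import build_shapefile
# build_shapefile(**params)
print(sorted(params))
```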
from lawn_paths.map_builder.visualize import visualize
visualize(filename: str, output_file: str)
matplotlib==3.3.2 is required
filename: path to the SHP file with paths to visualize
output_file: path to the output HTML file with the interactive map
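Putting both steps together, a full run might look like this. File names are placeholders, and both library calls require the package to be installed, so they are shown commented out:

```python
# Illustrative end-to-end usage; file names are hypothetical.
shp_file = "paths.shp"
html_file = "paths_map.html"

# Requires lawn_paths, GeoPandas and matplotlib==3.3.2 to be installed:
# from lawn_paths.map_builder.shapefile import build_shapefile
# from lawn_paths.map_builder.visualize import visualize
# build_shapefile("target/", output_filename=shp_file)
# visualize(shp_file, html_file)
print(shp_file, "->", html_file)
```

Opening the resulting HTML file in a browser shows the trails on an interactive OpenStreetMap/Leaflet map.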