Cloud Compare with Python automation
Posted: Fri May 16, 2025 5:10 pm
Hi,
I'd like to build a little interface that will load a bunch of LAS files into CC, process them, and save out files. Pretty straightforward.
I'd like to control it with Python and run multiple instances of CC concurrently, with Python feeding files to the instances as each one finishes processing its current LAS file. If I can get it working for a single instance of CC, I'd then like to expand it to multiprocessing.
It may help to review the specific workflow I hope to achieve, and to determine whether it can be accomplished through a plugin, the CLI, or the CloudComPy interface (which I have found VERY difficult to build). CloudComPy requires Qt5, which no longer provides some required dependencies, and I have wasted days trying to build the CloudComPy tool. It seems broken to me. Or maybe it broke me! I have temporarily given up on building CloudComPy with CMake and Visual Studio; it was hellish. Anyway, my goal is pretty simple, and I could do it all by hand, but automating it in software seems sensible, so what I'm after now is a how-to that actually works.
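For the Python side, here is roughly what I have in mind: a pool of workers, each spawning its own headless CC instance, with the pool handing out the next file as workers free up. This is only a sketch; the install path in CC_EXE is an assumption for my machine, and the per-file processing flags still need to be filled in.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Assumed install path -- adjust for your machine.
CC_EXE = r"C:\Program Files\CloudCompare\CloudCompare.exe"

def build_command(las_file):
    """Build a headless CloudCompare invocation for one LAS file.
    -SILENT and -O / -GLOBAL_SHIFT AUTO are documented CLI flags;
    the filter/rasterize/save flags for the workflow would go
    where the placeholder comment is."""
    return [
        CC_EXE,
        "-SILENT",
        "-O", "-GLOBAL_SHIFT", "AUTO", str(las_file),
        # ...per-file filter/rasterize/save flags go here...
    ]

def process_one(las_file):
    # Each call spawns its own CC instance; threads are fine here
    # because the real work happens in the child process.
    subprocess.run(build_command(las_file), check=True)
    return las_file

def process_all(las_files, workers=4):
    # The pool feeds the next file to whichever worker finishes first.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for done in pool.map(process_one, las_files):
            print(f"finished {done}")
```

Usage would be something like `process_all(sorted(glob.glob("*.las")))`, with `workers` tuned to however many CC instances the machine can stand.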
1. Open the next file.
2. In the Open LAS file dialog, under Loading/ Standard Fields, click Select All, then Apply.
3. In the Global Shift/Scale dialog, click Yes.
4. In the DB Tree window, select the lowest child element in the DB Tree.
5. In the Properties window, under Scalar Fields (note: NOT under CC Object/ Scalar field!), set Scalar Fields/ Active to Classification.
6. Activate the Min/Max tool (also known as the Filter points by value tool), set its range to 1.1 to 2.1, and click Export.
7. Make sure the newly filtered point cloud (now the lowest child object in the DB Tree) is selected.
8. Export/save the filtered point cloud with a proper name to a proper location.
9. With the filtered point cloud still selected in the DB Tree, activate Tools/ Projection/ Rasterize (and contour plot) (also known as the Convert a cloud to 2D raster tool).
10. In the Rasterize tool, set:
    - step = 1.0 to get started (to save time during testing; in production we'll want 0.5, 0.3, or another value, which could be user-settable via an .ini file)
    - active layer = Cell height values
    - Projection direction = Z
    - cell height = median
    - Std. dev. layer = Intensity
    - project SFs = ON, with median value selected
    - resample point cloud = ON
    - Empty cells/ Fill with = Interpolation to get started (we'll use kriging in production; like the step, this could be user-selectable in the .ini file)
11. Activate the Update Grid function, which opens the Grid Size window; click Yes.
12. Wait for the grid update to complete.
13. When the grid is updated, on the Export tab of the Rasterize dialog, select Raster, give the file a proper name and save location, and save it.

So processing each file would produce two derivative files: first an LAS file of the filtered point cloud, and second a TIF raster of the rasterized filtered point cloud.
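From what I can see on the wiki's Command line mode page, most of those steps map onto CLI switches, something like the following. This is untested and a sketch only: flag availability varies by CC version, older builds want an SF index rather than a name for -SET_ACTIVE_SF, and I'm not sure the CLI exposes median projection or kriging fill at all.

```shell
CloudCompare -SILENT ^
  -O -GLOBAL_SHIFT AUTO tile_001.las ^
  -SET_ACTIVE_SF Classification ^
  -FILTER_SF 1.1 2.1 ^
  -C_EXPORT_FMT LAS -SAVE_CLOUDS FILE filtered_tile_001.las ^
  -RASTERIZE -GRID_STEP 1.0 -EMPTY_FILL INTERP -OUTPUT_RASTER_Z
```

(The `^` line continuations are Windows batch syntax, since that's where I was headed with my batch-file attempt.)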
Is there anyone who could help me with this in any way? Suggestions, collaboration, recommendations for solutions?
Should I try building CloudComPy with Docker or Conda instead of CMake/Visual Studio?
I spent time building a batch file but learned it cannot control the tools inside CC, so I need something that can. The plugin, maybe?
I thought external control would be the most flexible and extensible route, but CloudComPy is a grizzly bear to build, so I am open to new ideas.
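Side note on the .ini idea from the Rasterize step: in Python that part at least is easy, something like the sketch below (the file name and key names are hypothetical, just to show the shape).

```python
import configparser

# Hypothetical .ini keys mirroring the user-settable Rasterize options.
DEFAULTS = {"grid_step": "1.0", "empty_fill": "INTERP"}

def load_settings(path="cc_batch.ini"):
    """Read the grid step and empty-cell fill method, falling back to
    the testing defaults when the file or a key is missing."""
    cfg = configparser.ConfigParser()
    cfg["rasterize"] = DEFAULTS       # defaults for testing
    cfg.read(path)                    # a missing file is silently ignored
    return (cfg.getfloat("rasterize", "grid_step"),
            cfg.get("rasterize", "empty_fill"))
```

In production the .ini would carry 0.5 or 0.3 for the step and KRIGING for the fill, without touching the code.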
Thank you,
-benb