GeoRabble Perth: An evening to share geo-ideas without sales pitches
GeoRabble is about celebrating the everyday challenges and triumphs of working with location. Everything from the mundane to the glamorous, whatever gets your GeoRocks off.
We are a group of geo-obsessed people who want to celebrate the real work done by other geo-obsessed people, unfiltered by professional bodies, government and private company agendas and industry politics.
Anything to do with GeoTech, GeoDev, GeoBusiness, GeoTrends, GeoFutures – you name it, as long as you’re passionate and want to share your challenges, triumphs, frustrations and pride in the work that you do. We have active GeoRabble groups in Sydney, Melbourne and now Perth.
I have a table (Vulcan Block Model, GeoSoft Target Voxels, XYZ Tables, Surpac String Files, etc.). I just want to get the 3D values into ArcGIS 3D Analyst so I can visualise them.
Never fear. There is a simple workflow to get them into ArcGIS:
- Import the table to a 3D feature class
- Understand the statistical distribution of your values (via histogram)
- Use SQL expressions to exclude unimportant values
- Use SQL expressions to categorise values into groups (a layer for each group)
- Assign the layers 3D Cube Symbols
- Convert the layers with 3D symbols to a feature class
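The two SQL steps above (excluding unimportant values, then splitting the rest into a layer per group) can be sketched in plain Python. The field name GRADE and the break values here are hypothetical; substitute whatever your histogram from step 2 suggests.

```python
# Sketch: build one SQL definition query per class interval.
# GRADE and the break values are made-up examples, not from any real table.

def group_queries(field, breaks):
    """Return one SQL WHERE clause per half-open interval [lo, hi)."""
    queries = []
    for lo, hi in zip(breaks[:-1], breaks[1:]):
        queries.append(f"{field} >= {lo} AND {field} < {hi}")
    return queries

# Values below the first break (0.5) are excluded by falling outside
# every interval; the rest split into three layers.
for q in group_queries("GRADE", [0.5, 1.0, 2.0, 5.0]):
    print(q)
```

Each returned clause becomes the definition query of one layer, so each group can then be given its own 3D cube symbol.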
My data is slow. In particular, the draw times are terrible and the processing times are poor. I have so much data that I wonder if there are things I can do at the database end to make it run faster. I don’t have an enterprise solution; I need a desktop solution.
Never fear, there are a few best practices to optimise your data so it’s smarter and therefore faster:
- Do not use Shapefiles or the Personal (Access) Geodatabase
- Use the File Geodatabase
- Upgrade Your Old File Geodatabase
- Add and Calculate the Spatial Index Grid
- Iterate compacts to free orphaned lock files and defragment files
- If you don’t edit it, compress it
- Iterate the repair tool on all feature classes
- Use an attribute index where appropriate
- Understand resolution and tolerance
- Standardise your projections and datums
- Manage your path names and conventions
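On the orphaned lock files mentioned above: a file geodatabase is just a folder, and leftover .lock files are plain files inside it. Here is a minimal sketch of locating them before a compact; the folder and lock-file names are invented for the demo, and the compacting itself should still be done with the geodatabase's own Compact tool.

```python
import os
import tempfile

def find_lock_files(gdb_path):
    """List leftover .lock files inside a file geodatabase folder.
    Only remove these once you are sure no session still holds them."""
    return sorted(
        os.path.join(gdb_path, name)
        for name in os.listdir(gdb_path)
        if name.endswith(".lock")
    )

# Demo against a throwaway folder standing in for a .gdb directory.
demo = tempfile.mkdtemp(suffix=".gdb")
open(os.path.join(demo, "stale_session.lock"), "w").close()
print(find_lock_files(demo))
```

A stale lock that survives a crashed session is exactly the kind of thing that blocks schema changes until it is cleared.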
I just want to get my data from different client-supplied vector formats into ArcGIS. I need to know that it’s going to work. Sometimes I find that one particular data set is lying to me: each time I run an analysis, the same process returns different results. Other times my one dataset crashes ArcGIS.
Never fear, there are a few best practices you can apply to ensure your data is clean and reliable.
Different products have different ideas about how geometries should be created. ArcGIS does not quality-check data brought in from other products to ensure it conforms to the ArcGIS geometry model. ArcGIS does its best to ensure that its own processes do not create ‘unsupported geometries’; however, it’s not perfect and frequently does. To ensure clean vector data:
- Understand XY Resolution & XY Tolerance (see last blog entry)
- Iterate the “Repair Geometry” tool
- Avoid Shapefiles and use a Geodatabase
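To make the ‘unsupported geometries’ idea concrete, here is a small sketch of two symptoms the repair step catches in a polygon ring: duplicate consecutive vertices and a ring that doesn’t close. The checks and sample coordinates are illustrative only; actual cleaning should go through the repair tool.

```python
def ring_problems(ring):
    """Flag two common bad-geometry symptoms in a polygon ring,
    given as a list of (x, y) vertex tuples."""
    problems = []
    if ring[0] != ring[-1]:
        problems.append("unclosed ring")
    for a, b in zip(ring, ring[1:]):
        if a == b:
            problems.append(f"duplicate vertex at {a}")
    return problems

# A made-up ring with a repeated vertex that never closes back on itself.
bad = [(0, 0), (1, 0), (1, 0), (1, 1)]
print(ring_problems(bad))
```

Geometry like this is what makes the same analysis return different results from run to run, because each tool copes with the bad ring differently.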
I just want to move my hard work from a supplied Shapefile to a File Geodatabase, then into the Enterprise Geodatabase. However, the geography is moving around like it’s in the wind. My cadastre now has all sorts of sliver polygons, and the attribute table reports different areas and distances from the original data sets.
Never fear, there are a few basic things you can do to begin to mitigate these drift issues.
Each spatial data type stores the geometry differently, and each storage method can have vastly different limitations. In addition, the projection and datum defined on the spatial data have a significant impact on geometry drift. Assumption and ignorance can be the stepmother of many issues. Drift is specifically caused by:
- The underlying spatial data types:
- XY Resolution
- XY Tolerance
- The projection and datum defined on the data
To mitigate drift:
- Standardise your projection and datum
- Standardise your transformation equations
- Standardise your XY Resolution
- Understand your data’s geometry
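Why standardising XY Resolution matters can be shown with a small sketch: stored geometry sits on a grid of the defined resolution, so moving data into storage with a coarser grid snaps every vertex, which is exactly the drift described above. The coordinates and resolution values below are illustrative, not defaults from any particular product.

```python
def snap(coord, resolution):
    """Snap an (x, y) coordinate onto a storage grid of the given XY
    resolution, mimicking the rounding that happens when geometry is
    written into storage with that resolution defined."""
    return tuple(round(v / resolution) * resolution for v in coord)

pt = (115.860457, -31.950527)    # a made-up point near Perth
coarse = snap(pt, 0.001)         # coarse grid: visible drift
fine = snap(pt, 0.0000001)       # fine grid: drift is negligible
print(coarse, fine)
```

Run a cadastre vertex through two different resolutions like this and you can see where the sliver polygons and the changed areas and distances come from.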