Keep a closer eye on your #MapBrowser projects in one place with project marker lists showing all your added locations. 👀 In each marker list, you can add specific attributes, creating a shareable dataset that captures any details about a particular location or property. You can also filter the marker list and download the data as a CSV file. Do more with MapBrowser 👉 https://lnkd.in/gw65ij2b #NearmapNextUp #LocationIntelligence #GIS
Nearmap’s Post
More Relevant Posts
-
📹 Building a RAG application using open-source models ❓ Asking questions from a PDF using Llama2 ❓ An awesome tutorial on how to connect LLMs to your data. Video: https://lnkd.in/gD73FYDD Code: https://lnkd.in/gxbHph8c
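For a rough idea of what such a pipeline involves, here is a minimal sketch of my own (not the tutorial's code): extract the PDF text, embed the chunks, retrieve the most relevant ones for a question, and hand them to Llama 2 as context. The file name, embedding model, and Llama 2 runtime are assumptions.

```python
# Minimal RAG sketch (my own illustration, not the linked tutorial's code).
# Assumptions: pypdf and sentence-transformers are installed, and a local
# Llama 2 model is available through some runtime (Ollama, llama.cpp, etc.).
import numpy as np
from pypdf import PdfReader
from sentence_transformers import SentenceTransformer

def load_chunks(pdf_path: str, chunk_size: int = 800) -> list[str]:
    """Extract the PDF text and split it into fixed-size character chunks."""
    text = "".join(page.extract_text() or "" for page in PdfReader(pdf_path).pages)
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]

def top_k_chunks(question: str, chunks: list[str], model, k: int = 3) -> list[str]:
    """Rank chunks by cosine similarity between question and chunk embeddings."""
    q_vec = model.encode([question])[0]
    c_vecs = model.encode(chunks)
    scores = c_vecs @ q_vec / (np.linalg.norm(c_vecs, axis=1) * np.linalg.norm(q_vec))
    return [chunks[i] for i in np.argsort(scores)[::-1][:k]]

if __name__ == "__main__":
    embedder = SentenceTransformer("all-MiniLM-L6-v2")
    chunks = load_chunks("paper.pdf")                     # hypothetical file name
    question = "What problem does the paper address?"
    context = "\n\n".join(top_k_chunks(question, chunks, embedder))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    # Send `prompt` to Llama 2 with whichever runtime you use, e.g.
    # ollama.chat(model="llama2", messages=[{"role": "user", "content": prompt}])
    print(prompt)
```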
-
🚀 Just Wrapped Up a SQL Project: Excited to share the outcome of my latest SQL project with you! 📊💡 Thanks to the legend Alex Freberg for the guidance. Check it out here: https://lnkd.in/djTBSQAV
🧹 Data Cleanup Galore: Ever wondered what goes into tidying up messy data? Here's a sneak peek at what I've been up to:
- Making sure all data plays nice together by standardizing it.
- Giving missing columns some love and attention.
- Saying goodbye to pesky duplicates.
- Hunting down and fixing those sneaky typos.
- Giving NULL values a new lease on life.
- Breaking down big columns into bite-sized chunks.
- Giving the boot to any unwanted columns.
Stay tuned for more details on this adventure! #DataCleanup #SQLProject #DataWrangling
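The project above is written in SQL; purely to make the listed steps concrete, here is a rough pandas equivalent of the same kind of cleanup. The file and column names are hypothetical, not taken from the project.

```python
# Rough pandas sketch of the cleanup steps listed above (the project itself is
# in SQL). All file and column names here are hypothetical.
import pandas as pd

df = pd.read_csv("raw_records.csv")                          # hypothetical input file

df["company"] = df["company"].str.strip().str.title()        # standardize text formatting
df["industry"] = df["industry"].replace("", pd.NA)           # treat blank strings as missing
df["industry"] = df.groupby("company")["industry"].transform(
    lambda s: s.ffill().bfill()                              # fill gaps from matching rows
)
df = df.drop_duplicates()                                    # remove exact duplicate rows
df["date"] = pd.to_datetime(df["date"], errors="coerce")     # unparsable dates become NaT
df[["city", "country"]] = df["location"].str.split(",", n=1, expand=True)  # split one wide column
df = df.drop(columns=["location"])                           # drop the column we no longer need

df.to_csv("cleaned_records.csv", index=False)
```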
-
Let's enhance our SQL skills by diving into the application of the GROUP BY keyword. Imagine having a sales table with columns like product_id, product_category, quantity_sold, and price. The goal? To compute the total sales for each product category. The magic happens with this query:

```sql
SELECT product_category,
       SUM(quantity_sold * price) AS total_sales
FROM sales
GROUP BY product_category;
```

#SQL #GroupBy #DataAnalysis #CodingSkills
-
Unlock the power of web scraping in Power BI. Join our Founder Adeiza Sulieman tomorrow for our practical masterclass and learn to build, deploy, and automate your first web scraping project. See you at 8 PM GMT+1! Click the link bit.ly/44jnWo1 to secure your spot promptly. #10alytics #dataanalytics
-
#PowerBI July update is out. Looks like there are some improvements to #AzureMaps reference layers:
👉 CSV Support: The new support for CSV files as data sources for reference layers makes it easier to integrate various data formats, alongside existing options like GeoJSON, Shapefiles, WKT, and KML files.
👉 Enhanced Customization: The ability to format reference layer shapes directly within the formatting pane. You can now customize the color and width of points, lines, and polygons without modifying your reference layer files, but there is no conditional formatting yet 😢
👉 Dynamic URL Sources: Adding dynamic URL support using conditional formatting is impressive. It allows different reference layers to load dynamically based on data-bound conditions, such as slicer selections, enhancing the interactivity of your reports.
Ring that 🔔 to stay up to date with all things #PowerBI.
-
Hello there, CLICK here if you are interested in analysis-ready monthly Sicily Region precipitation data: https://lnkd.in/dHMiHihk

First of all: I will write a more formal and comprehensive report on what I have done here and what I plan to do with this repo, which I created in the hope of facilitating work related to monitoring Sicily's ongoing drought.

GOAL: This is a first intro to my current work/practice, which serves both as a way to build my portfolio and as an open call for anyone who wishes to help me (voluntarily) make Sicily (HOME) precipitation data ready for geo/data analysis in an open and free environment like GitHub.

- Tools: VS Code (used locally with Python) and Colab (only to populate the 'ee_fc' folder, and in the future to use ee and geemap without installing them locally on my PC; enough so far).
- What does it do? It fetches ONLY precipitation data via the API (CKAN type) from: https://lnkd.in/dZjc5wT2 (the Sicily data page). Note that only precipitation from mid-2019 to mid-2022 is available (their choice for some reason, NO CLUES WHY), but you can send them a PEC to request historical data (I guess so) and then take advantage of what I did in my repo by adding those data; it is up to you to check what license limits you may encounter in your work.
- I aggregated the data as monthly precipitation per agro-meteorological station (identified by name and ID_STAZ) and then merged them with another file (elenco_sensori_meteo_details.json) containing station location data (LON, LAT and ALT). Files are named 'month-year', with each row corresponding to a SIAS station and the precipitation value for that month-year.

The repo in brief (I structured it around the type of files you may need; for instance, if you just want to compute numbers, you can use only the preprocessed CSV files; if you need to do geospatial work, use the GeoJSON files, and so on):
- /datasets/: the original CSV files from the 'sicily datapage', retrieved with the API (CKAN); see the script 'fetch_sicily_data.py'.
- /preprocessed_datasets/: the 'cleaned' files. The sensors provide hourly data, but for the kind of further analysis one can do (for example SPI, and also to keep things lighter) the data are AGGREGATED monthly (my decision) starting from the hourly precipitation; each monthly total (e.g. 03-2021) is kept for every SIAS agro-meteorological station.
- /geoDataframes/ and /ee_fc/: the most important folders, because here you will find the precipitation data (divided by year folder) with geometry (point locations with lon and lat), ready to load into QGIS or a Python script; you can also easily export the JSON files in 'ee_fc' as ee.FeatureCollection assets.

#sicilia #sias #precipitation #python #github #open #free #data #geojson #geopandas
GitHub - fener95/sicily-precip
github.com
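If you want to poke at the GeoJSON outputs, a few lines of GeoPandas are enough to get going. This is only a sketch: the folder/file path and the precipitation column name ('prec_mm') are my assumptions (ID_STAZ comes from the post), so check the repo before relying on them.

```python
# Hedged sketch for exploring one of the repo's monthly GeoJSON files.
# The path and the 'prec_mm' column name are assumptions for illustration.
import geopandas as gpd

gdf = gpd.read_file("geoDataframes/2021/03-2021.geojson")   # hypothetical path

print(gdf.crs)               # check the coordinate reference system
print(gdf.columns.tolist())  # see the actual column names in the file

# Example: the five driest stations for that month (assumed column names)
driest = gdf.sort_values("prec_mm").head(5)
print(driest[["ID_STAZ", "prec_mm"]])

# Re-export a subset, e.g. to open in QGIS or to build an Earth Engine asset later
driest.to_file("driest_03-2021.geojson", driver="GeoJSON")
```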
-
In today's article, I will show you two ways to enrich one table with data from another in MapInfo Pro. The first is fine when updating a single column. The second is better if you update multiple columns, saving you time. Happy reading and happy #MapInfoMonday! #mapinfopro #enrich #spatialanalytics
MapInfo Monday: Two Ways to Update a Table with Values from another Table
community.precisely.com
-
Iain Paton, check out the TempPathnameCreator for a workflow like this, as it might give you an alternative option to consider. It creates a temp path you can write some data to (effectively staging it to disk); you can then read it back and move it on. A useful technique if you're creating a local copy before, say, passing it up to an FTPCaller or a DropboxConnector or similar. The beauty of it is that when the workspace completes, FME cleans up the entire temp path, so you don't end up with duplicates left behind. https://lnkd.in/derjssDx
I forgot FME Friday last week (had a university deadline), so here is some Midweek (data) Manipulation. For those who are lazy or efficient: write your features out, read them in again in a slightly different way, and cover your tracks by deleting the file if necessary (or just overwrite it each time if it is not appended, e.g. for CSV). The System Caller does the deletion. I've used this a few times: ESRIJSON to GeoJSON a while ago, and more recently to change the column (attribute) names in a CSV to the contents of the first row. There are always other ways to do this sort of thing, but this one is just so easy.
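Outside FME, the same stage-to-a-temp-path-then-clean-up pattern looks roughly like this in plain Python (a sketch of the idea, not FME code, with hypothetical file and field names): write a local copy, read it back in a different form, and let the temporary folder disappear on its own.

```python
# Rough Python analogue of the pattern described above (not FME): stage data to a
# temporary path, read it back in a different form, and let cleanup happen
# automatically when the work is done. File and field names are hypothetical.
import csv
import json
import tempfile
from pathlib import Path

records = [{"id": 1, "name": "Station A"}, {"id": 2, "name": "Station B"}]

# The temp directory plays the role of the staging path; it is removed (with
# everything written into it) when the `with` block exits.
with tempfile.TemporaryDirectory() as tmp:
    staged = Path(tmp) / "staged.csv"

    # Write the features out...
    with staged.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "name"])
        writer.writeheader()
        writer.writerows(records)

    # ...then read them back in a slightly different way (here: CSV -> JSON)
    with staged.open(newline="") as f:
        reread = list(csv.DictReader(f))

    Path("output.json").write_text(json.dumps(reread, indent=2))

# At this point the staged CSV is gone; only output.json remains.
```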