Renaming output to reflect name of input using ArcGIS ModelBuilder?
Every quarter I will be replacing the two CSV files at the beginning of the model. I would like the final Excel file and feature class to each have a name that reflects the input CSV file name (please see model below). For the example below, I would like to have them renamed as: 2015Q1, as a reflection of the names of the input CSV files.
Is this possible to do?
I am not using 'iteration', which most of the similar questions are referring to.
I would recommend the Parse Path tool in ModelBuilder. Make sure to select the "Name" parse type. In this (very simplified) model, I included a "Workspace" variable that I can call at any point and combine with the name value from the Parse Path output. The syntax for that would be %Workspace%\%Value% in the output path parameter. You should also attach a precondition so that the Parse Path tool runs in sync with the rest of the model.
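Outside ModelBuilder, the same name extraction can be sketched in plain Python; the csv_path and workspace values below are illustrative stand-ins, not paths from the model (ntpath is used so the Windows-style path parses on any OS):

```python
import ntpath  # Windows path semantics, regardless of the OS running this

csv_path = r"C:\data\2015Q1.csv"   # illustrative input CSV
name = ntpath.splitext(ntpath.basename(csv_path))[0]   # what "Name" parse type yields
workspace = r"C:\workspace"        # stands in for %Workspace%
output_path = ntpath.join(workspace, name)             # %Workspace%\%Value%
print(name)          # -> 2015Q1
print(output_path)   # -> C:\workspace\2015Q1
```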
The input function in Python 2.7 evaluates whatever you enter as a Python expression. If you simply want to read strings, use the raw_input function instead, which does not evaluate what it reads.
If you are using Python 3.x, raw_input has been renamed to input. Quoting the Python 3.0 release notes,
raw_input() was renamed to input(). That is, the new input() function reads a line from sys.stdin and returns it with the trailing newline stripped. It raises EOFError if the input is terminated prematurely. To get the old behavior of input(), use eval(input()).
In Python 2.7, there are two functions that can be used to accept user input. One is input and the other is raw_input. You can think of the relation between them as input(prompt) being equivalent to eval(raw_input(prompt)).
Consider the following piece of code to understand this better
input accepts a string from the user and evaluates the string in the current Python context. When I type dude as input, it finds that dude is bound to the value thefourtheye and so the result of evaluation becomes thefourtheye and that gets assigned to input_variable .
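A hedged reconstruction of that snippet: the names dude and thefourtheye come from the surrounding text, and eval makes explicit the evaluation step that Python 2's input performs implicitly, so this sketch also runs under Python 3.

```python
dude = "thefourtheye"

# What the user types at the "Enter your name: " prompt.
typed = "dude"

# Python 2's input() evaluates the typed text in the current context;
# eval() makes that step explicit.
input_variable = eval(typed)
print(input_variable)   # -> thefourtheye
```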
If I enter something else that is not bound in the current Python context, it will fail with a NameError.
Security considerations with Python 2.7's input :
Since whatever the user types is evaluated, this poses security issues as well. For example, if you have already loaded the os module in your program with import os, and the user then types in os.remove('/etc/hosts'),
this will be evaluated by Python as a function-call expression and executed. If you are running Python with elevated privileges, the /etc/hosts file will be deleted. See how dangerous this could be?
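To see the mechanism without deleting anything, here is a sketch that evaluates a harmless call the same way; os.getcwd() stands in for the destructive os.remove('/etc/hosts'):

```python
import os

# Pretend the user typed this at a Python 2 input() prompt.
typed = "os.getcwd()"   # imagine os.remove('/etc/hosts') here instead

# input() would evaluate the string as an expression -- the call runs.
result = eval(typed)
print(result)           # prints the current working directory
```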
To demonstrate this, let's try to execute the input function again.
Now, when input("Enter your name: ") is executed, it waits for user input, and because the typed input is itself a valid Python function invocation, that too is invoked. That is why we see the Enter your name again: prompt a second time.
So you are better off with the raw_input function, like this:
If you need to convert the result to some other type, you can use the appropriate function to convert the string returned by raw_input. For example, to read inputs as integers, use the int function, as shown in this answer.
In Python 3.x, there is only one function to get user input, and it is called input; it is equivalent to Python 2.7's raw_input.
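For instance, a sketch of reading an integer; the typed variable stands in for what raw_input (Python 2) or input (Python 3) would return from the prompt:

```python
# raw_input (Python 2) / input (Python 3) always returns a string;
# convert it explicitly when you need a number.
typed = "42"            # stands in for raw_input("Enter your age: ")
age = int(typed)
print(age + 1)          # -> 43
```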
Successful highway infrastructure planning and maintenance requires significant investments in terms of time, human resources, and money. Every year, billions of dollars are spent on maintaining highway infrastructure via new construction projects, road maintenance, and rehabilitation activities (Lee et al. 1996; Zhang et al. 2001). Increasing urbanization has led to a growing demand for highway infrastructure, resulting in transportation systems becoming more complex in response to the demand (O’Brien et al. 2012; Podgorski and Kockelman 2006). Consequently, the need to optimally allocate limited resources to maintain and improve the state of transportation infrastructure cannot be overemphasized. These factors, among others, significantly affect public funds expenditure on highway infrastructure development, thus drawing increased public scrutiny to budget planning and funds allocation for highway infrastructure (Sanchez 2006).
Today, the critical focus on and the need for efficient management practices in transportation planning and policy are underscored by key federal laws passed in the last three decades. These include the Surface Transportation and Uniform Relocation Assistance Act of 1987, the ‘Intermodal Surface Transportation Efficiency Act of 1991 (ISTEA),’ and the recent ‘Moving Ahead for Progress in the 21st Century Act (MAP-21).’ These laws have exhibited an increasing emphasis on integrated management practices and efficient use of federal funds. They have also highlighted the need to invest in transportation systems from economic, socio-cultural, technological, and sustainable-systems perspectives. Central to these considerations is a push towards the use of data to drive more informed decisions by highway agencies concerning highway planning and management (Thill 2000).
Such advancement notwithstanding, the transportation planning process is a continuous and arduous task involving data, models, and users. As part of the planning process, decision-makers need to use a vast collection of data and information to address a number of substantial issues like traffic management, construction scheduling, Right of Way (ROW) acquisition, public communication, and others (Nobrega and O’Hara 2011; Sankaran et al. 2016a; Woldesenbet et al. 2015). Highway agencies often collect a plethora of data and information about the nation’s network of highways. The sources of such data vary widely, and their forms range from drawings, pictures, maps, tables, and text descriptions to accounts of personal experience (Flintsch et al. 2004). However, highway agencies usually have to deal with fragmented databases, multiple incompatible models, redundant data acquisition efforts, and sub-optimal coordination between the various agencies or departments operating on the same highway facilities (Chi et al. 2013; Ziliaskopoulos and Waller 2000). To compound this problem, multiple independent legacy information systems usually co-exist within the same agency (Chi et al. 2013; Thill 2000).
In spite of huge investments in data collection and archiving efforts, the amount of information and knowledge generated from data repositories is minimal and less supportive of informed decision making. Additionally, both practitioners and researchers have questioned the efficiency of data programs in meeting the needs of users for highway infrastructure planning purposes (Flintsch and Bryant 2006; Woldesenbet et al. 2015). Transportation professionals still face the onerous task of organizing highway data into suitable forms to support decisions concerning highway maintenance, rehabilitation, traffic control, highway monitoring, and project prioritization. These issues have given rise to a surge in the demand for effective practices and tools that can integrate, manage, and analyze highway data (Parida and Aggarwal 2005).
Over the past two decades, many State Departments of Transportation (DOTs) have explored the use of digital information systems for highway management decision support (Kang et al. 2008; Lee et al. 1996). Accordingly, TxDOT relies on several information systems; these include, but are not limited to, the Pavement Management Information System (PMIS), a Maintenance Management Information System (MMIS) referred to as COMPASS, and the Design and Construction Information System (DCIS). The challenges associated with accessing and leveraging data from multiple information sources highlight the need for an integrated system that can fuse project data and support applications to aid Maintenance and Rehabilitation (M&R) planning. While TxDOT has made progress in integrating some highway data, it lacks an automated process to visualize and integrate data from the individual information systems to better support highway project planning decisions.
The need for such a system has grown for metropolitan districts since they have significantly more lane-miles of on-system highways under their responsibility and, consequently, more projects in various phases of development and delivery at any given time. The funding mechanism for maintaining, rehabilitating, and upgrading the existing system is complex. It has become further complicated since TxDOT’s funding is dependent on revenue from multiple sources with different permissible uses. Moreover, the planning process is fiscally constrained at the category level: the amount of funding available determines the number of projects that can be planned within specific categories. Metropolitan planning organizations’ (MPO) policy boards have responsibility for certain funding categories requiring concurrence from TxDOT. In addition, at any given time, several projects are in various phases of construction. The actual cost of construction can vary from the budgeted costs, which affects the category funds available going forward. Construction costs can vary from budgeted costs at the time of bidding or throughout the construction phase owing to change orders, unexpected conflicts needing additional right-of-way, costs to relocate existing utilities, and many others. Furthermore, there are instances when existing roadways that were not expected to be rehabilitated within the planning horizon have to be rehabilitated owing to faster deterioration in condition. This leads to reactive maintenance to maintain safety and pushes lower-priority projects down the list. The combined effect of these factors (and many more) creates a need for an integrated planning process leveraging modern visualization tools that allow temporal and spatial data to be integrated, viewed, reviewed, and updated in a dynamic setting.
The rest of the paper is organized as follows. The next section describes the general challenges in performing highway planning tasks and how GIS has been previously used to address some of the identified challenges. In the “Objective and Methodology” section, a research framework based on a case study is presented. Following this, the “Case Study” section points out district-specific barriers to planning tasks and how a GIS-based tool was developed to integrate data from multiple information systems used by the district. A formalized presentation of the benefits of GIS integration and visualization to M&R planning follows. Finally, the paper ends with conclusions on the findings of this study and the directions for future work in the “Conclusions” section.
The effective data mining of social media has become increasingly recognized for its value in informing decision makers of public welfare. However, existing studies do not fully exploit the underlying merit of big data. In this study, we develop a data-driven framework that integrates machine learning with spatial statistics, and then use it on Xiamen Island, China to delineate urban population dynamic patterns based on hourly Baidu heat map data collected from August 25 to September 3, 2017. The results showed that hot grids are primarily clustered along the main street through the downtown area during working days, whereas cold grids are often observed at the edge of the city during the weekend. The mixed use (of commercial and life services, restaurants and snack bars, offices, leisure areas and sports complexes) is the most significant contributing factor. A new cold grid emerged near conference venues before the Brazil, Russia, India, China, and South Africa Summit, revealing the strong effects of regulations on population dynamics and its evolving patterns. This study demonstrates that the proposed data-driven framework might offer new insights into urban population dynamics and its driving mechanism in support of sustainable urban development.
Soil erodibility mapping using the RUSLE model to prioritize erosion control in the Wadi Sahouat basin, North-West of Algeria
Soil losses must be quantified over watersheds in order to set up protection measures against erosion. The main objective of this paper is to quantify and to map soil losses in the Wadi Sahouat basin (2140 km²) in the north-west of Algeria, using the Revised Universal Soil Loss Equation (RUSLE) model assisted by a Geographic Information System (GIS) and remote sensing. The Model Builder of the GIS allowed the automation of the different operations for establishing thematic layers of the model parameters: the erosivity factor (R), the erodibility factor (K), the topographic factor (LS), the crop management factor (C), and the conservation support practice factor (P). The average annual soil loss rate in the Wadi Sahouat basin ranges from 0 to 255 t ha⁻¹ year⁻¹, maximum values being observed over steep slopes of more than 25% and between 600 and 1000 m elevations. 3.4% of the basin is classified as highly susceptible to erosion, 4.9% at medium risk, and 91.6% at low risk. Comparison with Google Earth imagery shows clear agreement with the mapped erosion-sensitivity zones. Based on the soil loss map, 32 sub-basins were classified into three categories by priority of intervention: high, moderate, and low. This prioritization supports a management plan against sediment filling of the Ouizert dam at the basin outlet. The method, combining the RUSLE model with comparison against Google Earth imagery, can be easily adapted to other watersheds.
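The overlay that the Model Builder automates is, per cell, a simple product of the five factor layers, A = R × K × LS × C × P. A minimal single-cell sketch, with purely illustrative values not taken from the Wadi Sahouat data:

```python
# Illustrative single-cell RUSLE computation (A in t ha^-1 year^-1).
# Factor values below are made up for demonstration only.
R, K, LS, C, P = 60.0, 0.3, 2.5, 0.2, 1.0
A = R * K * LS * C * P
print(round(A, 6))   # -> 9.0
```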
Part I: Raster Analysis of Housing Value Surface
1. Launch ArcGIS and add new data layers.
Copy the familiar ./data/eboston05 folder from the class data locker into a local folder such as C:\temp\lab5. In addition, unzip a new file ./data/boston96.zip into the same local folder C:\temp\lab5. You should end up with two folders named 'eboston05' and 'boston96' in your local C:\temp\lab5 folder. This boston96 folder contains all the new shapefiles we need for this exercise plus a personal geodatabase (11.521_boston96_gdb.mdb) that has MS-Access versions of the Postgres tables that we will use. The boston96 folder also contains an ArcMap document, boston96_lab5start.mxd, that gets you started with the familiar shapefiles from 'eboston05' plus the new ones that we will use today. The ArcMap document was saved using relative addresses, so if you opened it from inside of C:\temp\lab5\boston96 and the eboston05 folder is also in C:\temp\lab5, then ArcMap should be able to find the location of all the needed shapefiles.
/***** Alternative ArcMap startup steps *********/
Instead of starting with boston96_lab5start.mxd, we could open the familiar eboston05_lab2start.mxd document and add the new shapefiles since they are included elsewhere in our data locker. Use ArcCatalog to copy the additional shapefiles listed below from the class data locker into your local folder (C:\TEMP\LAB5).
- Z:\athena.mit.edu\course\11\11.521\data\ma_towns00.shp (Massachusetts Towns Boundaries)
- Z:\athena.mit.edu\course\11\11.521\data\msa_water.shp (Boston MSA Water Boundaries - to enable coloring the water blue and hiding census blockgroup boundaries)
- Z:\athena.mit.edu\course\11\11.521\data\boston_bg90.shp (1990 Boston Census Block Groups. 1990 is the middle of the sales data period.)
- Z:\athena.mit.edu\course\11\11.521\data\bosblocks05\blockmap05.shp (2005 outline of Boston Blocks)
Next, open ArcMap by double-clicking on your local copy of eboston05_lab2start.mxd in the C:\temp\lab5\eboston05 folder. Add to ArcMap your local copies of the additional shapefiles listed above, and then prune the shapefile list and rearrange their order as listed below. (Just check to be sure that your Data Frame is set to Mass State Plane (mainland) NAD83 meters coordinates. Note, also, that we include a second, 1996, shapefile for East Boston parcels instead of just the ebos_parcels05 version that we have already used.)
/****** end of alternative ArcMap startup steps ******/
If necessary, rearrange the ordering of layers in ArcMap as follows:
- msa_water.shp (or use MassGIS or ESRI web services from previous labs to improve context and visualization)
- ebos_parcels05 (the 2005 parcel map for East Boston that we have been using in previous labs)
- ebos_parcels96 (the 1996 parcel map for East Boston, in the 'eboston05' folder but not used in previous labs)
- blockmap05 (a block-level map of Boston in the 'bosblocks05' subdirectory of the class data locker that was obtained by 'dissolving' 2005 parcels to blocks)
- ma_towns00 (you can add your local copy or continue to point to the network version in the class locker)
To get started with a useful visualization of Boston, we suggest the following setup: Adjust (if necessary) the colors and transparency of the layers so that the ocean is blue with no border, and the Planning District boundaries for Boston are visible and other shapefiles are turned off.
MAP #1: Create a thematic map of median housing value by shading the boston_bg90 block group theme of 1990 census data based on the 1990 medhhval values. Be sure to exclude block groups with medhhval = 0 and use the default 'natural breaks' classification. Since many block groups had too small a sample of housing that is owned rather than rented, many block groups get excluded from your thematic map.
To reduce clutter and avoid coordinate system conflicts, let's add a new Data Frame that we will use for our raster operations. Click on Insert/Data_Frame and rename the new data frame 'East Boston Raster'. In the next section, we will divide East Boston into 50 meter grid cells. The boundary of East Boston that is included in bos_planning_districts will be the most convenient shapefile that we have to provide a bounding box for East Boston. However, that shapefile is saved on disk in NAD83 Mass State Plane coordinates (feet) rather than in NAD83 Mass State Plane coordinates (meters).

Beware of the coordinate system of your Data Frame and your data layers: If you started your ArcMap session using eboston05_lab2start.mxd or boston96_lab5start.mxd, then the data frame will be set to Mass State Plane (NAD83-meters). However, if you started with a blank ArcMap session and then first added the East Boston parcels (or planning districts), then the coordinate system of the parcel shapefile (which is Mass State Plane NAD83-feet) will be adopted by the data frame. Then, when you set a grid cell size of '50', the units will be 50 feet, not 50 meters.

We suggest that you use the bos_planning_districts shapefile for the 'mask' that isolates the East Boston area for your raster grid. But this shapefile is saved on disk in Mass State Plane NAD83-feet. Convert it to Mass State Plane NAD83-meters on disk and use that layer to build your East Boston 50x50m grid cells. When you do spatial operations (like converting vector to raster or spatial joins), you will find ArcGIS to be more reliable if each data layer is saved on disk in the same coordinate system.

To save a new copy of bos_planning_districts in NAD83 Mass State Plane (meters), right-click on the layer, choose Data/Export-data, and then be sure to use the same coordinate system as 'the data frame'. For the storage location, browse to C:\temp\lab5, name it bos_districts_meters, and save as file type: shapefile.
(If you are familiar with geodatabases, you can save it there but you might have a harder time finding it again later on.) Allow this new shapefile to be added to your ArcMap session and then copy and paste it into your new 'East Boston Raster' data frame. Then, copy and paste the 'ESRI_World_Shaded_Relief' and 'Mass Data from MassGIS GeoServer' layers into the new data frame (for visualization purposes) and you are ready to construct your 50 meter grid cells and begin the raster analysis.
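The cell-size pitfall described above is easy to quantify; a '50' entered while the data frame is in a NAD83-feet coordinate system means 50 survey feet, not 50 meters (the foot-to-meter factor is the standard US survey foot definition, not something from the lab data):

```python
# A cell size of '50' in a NAD83-feet coordinate system is 50 survey feet.
US_SURVEY_FOOT_M = 1200 / 3937.0      # exact definition of the US survey foot
cell_m = 50 * US_SURVEY_FOOT_M
print(round(cell_m, 2))               # -> 15.24, not the intended 50 m
```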
Agriculture Land Suitability Evaluator (ALSE): A decision and planning support tool for tropical and subtropical crops
Agricultural land suitability evaluation for crop production is a process that requires specialized geo-environmental information and the expertise of a computer scientist to analyze and interpret the information. This paper presents ALSE, an intelligent system for assessing land suitability for different types of crops in tropical and subtropical regions (e.g. mango, banana, papaya, citrus, and guava) based on geo-environmental factors; it automates the process of evaluation and presents the results in an attribute table. Its main features include support of GIS capabilities on the digital map of an area using the FAO-SYS framework model, with some necessary modifications to suit the local environmental conditions for land evaluation, and support of expert knowledge through spatial tools to derive criteria weights with their relative importance. A dynamic program for calculation of eigenvalues and eigenvectors of a weighting matrix is provided. Expertise and knowledge help ensure that ALSE databases represent realistic, practicable, and functional systems. ALSE is useful for decision makers in determining the quality of land for agricultural uses and is intended as a decision and planning support tool. Responsibility for any decisions based partly or wholly on the output of ALSE rests with the decision maker. ALSE helps ensure that the results are interpreted correctly within the relevant context, and contributes by supporting land-use planning and decision making.
► ALSE as an intelligent system for land evaluation using the FAO-SYS model and management expertise. ► GIS–MCE support systems that assist decision-makers on land suitability evaluation. ► ALSE as a useful system for land planners to make complex decisions within a short period.
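The eigenvector-based criteria weighting the abstract alludes to can be sketched with power iteration on a small pairwise-comparison matrix, in the AHP style; the matrix values below are illustrative and not taken from ALSE:

```python
# Power iteration: the principal eigenvector of a pairwise-comparison
# matrix, normalized to sum to 1, gives the criteria weights.
def principal_eigenvector(m, iters=100):
    n = len(m)
    v = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(m[i][j] * v[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        v = [x / s for x in w]
    return v

# Illustrative judgment: criterion A is 3x as important as criterion B.
weights = principal_eigenvector([[1.0, 3.0], [1.0 / 3.0, 1.0]])
print([round(w, 2) for w in weights])  # -> [0.75, 0.25]
```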
Starting Atmospheric Correction
- From the Toolbox, select THOR > THOR Atmospheric Correction. The Atmospheric Correction dialog appears.
Note: Atmospheric Correction is also available as a panel in many workflows, in addition to being available as a stand-alone tool.
THOR extracts a rough estimate of the image’s mean spectrum and compares it to a generalized radiance spectrum using Spectral Angle Mapper (SAM) to determine whether the input image is radiance or reflectance. If THOR determines that the input is reflectance, it automatically selects None / Already corrected as the correction method and warns you if you try to apply a correction method. However, the initial screening is only an estimate, so you can still proceed with an atmospheric correction method if needed.
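The SAM comparison at the heart of this screening can be sketched as the angle between two spectra treated as vectors; the spectra below are made-up illustrations, not ENVI internals:

```python
import math

def spectral_angle(a, b):
    # Angle between two spectra viewed as vectors: arccos of the
    # normalized dot product, clamped to guard against float round-off.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

print(spectral_angle([1.0, 0.0], [0.0, 1.0]))  # orthogonal spectra -> pi/2
print(spectral_angle([1.0, 2.0], [2.0, 4.0]))  # same shape, scaled -> ~0
```

Because the angle ignores overall magnitude, SAM responds to spectral shape rather than brightness, which is why it suits a radiance-vs-reflectance screening.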
Thursday, October 1, 2009
Calculate Lat/Long Values in ArcMap
Yet again, I had a difficult time finding sufficient information on calculating latitude and longitude values for a point shapefile. I've only encountered one or two situations in the past few years where I worked with a coverage that was missing these data, though it's still a basic and important technique that should be addressed a bit better.
It turns out that (at least) ArcMap 9.2 provides a pretty simple method to quickly calculate a number of geographic coordinates. A few things are important to consider though:
- Show ArcToolbox in ArcMap (or ArcCatalog)
- Open the Feature to Point tool
(Under Data Management Tools :: Features :: Feature to Point)
- Press Show Help to read more about the tool
- Select a polygon or line feature to use as an input feature
- Set an output location and name for the resulting feature class
- Check the "Inside" check box to calculate a centroid within the boundary of a given feature (i.e. a "bent" polygon similar to the shape of Florida can have a centroid in the Gulf of Mexico if "Inside" is not selected)
There is insufficient information on the internet about how this process works. A simple, automated calculator is built into the attribute table's field calculations. To make this work properly in this situation (calculating geographic latitude/longitude coordinates), the coordinate system must be set to a geographic coordinate system rather than a projected coordinate system.
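What Feature to Point computes for a polygon can be sketched with the standard area-weighted (shoelace) centroid; the square below is purely illustrative, and in real use you would work in the geographic coordinate system recommended above:

```python
# Area-weighted centroid of a simple (non-self-intersecting) polygon,
# via the shoelace formula. Vertices are (x, y) pairs in ring order.
def polygon_centroid(pts):
    a = cx = cy = 0.0
    n = len(pts)
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]
        cross = x0 * y1 - x1 * y0   # shoelace term for this edge
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5                        # signed polygon area
    return cx / (6 * a), cy / (6 * a)

square = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
print(polygon_centroid(square))     # -> (1.0, 1.0)
```

Note that, as with the "Inside" option above, this centroid can fall outside a strongly bent polygon; ArcMap's option snaps it inside the boundary instead.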