Tuesday, August 21, 2012
Network Analysis
Network-based spatial analysis is possible through ArcGIS Network Analyst. Essential functions include routing, fleet routing, travel directions, closest facility, service area, and location-allocation. This extension allows real-world network conditions to be modeled dynamically: one-way streets, U-turns, height restrictions, speed limits, and traffic can all be accounted for. The shortest route can be found, as can the most efficient route for visiting multiple locations. The closest place of interest can be located, and location-allocation analysis can be conducted to determine the optimal locations for facilities.
This lab applies network analysis to my mini vacation trip over the third weekend of May 2012. The trip was planned with a hotel booked and activities organized. It was a two-night trip staying at the Cottage Inn by the Sea in Pismo Beach. This vacation was especially memorable because the hotel sat in front of the beach overlooking the ocean. One fun activity planned was ATV riding; another was shopping on Higuera Street in San Luis Obispo. Lastly, visiting Hearst Castle was a must since it is a famous attraction in San Luis Obispo County. The optimal route starts at my apartment at 22027 S. Vermont Ave in Torrance, stops at the Cottage Inn by the Sea hotel at 2351 Price Street in Pismo Beach, continues to Higuera Street in San Luis Obispo, and ends at Hearst Castle at 100 Hearst Castle Rd in San Simeon. Addresses for these locations were found online and geocoded in ArcGIS to improve the accuracy of the exact locations before setting the route stops. Freeways and major streets were added as layers, and the points of interest were labeled. After setting the stops, clicking the Solve button in the Network Analyst toolbar generated the optimal route. The impedance was set to time, so the fastest route was generated based on speed limits; the distance unit was set to miles.
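For readers who want to script the same routing workflow, a minimal arcpy Network Analyst sketch is shown below. The network dataset name, workspace, impedance attribute, and stops file are hypothetical placeholders, not the actual data used in this lab; the logic simply mirrors the steps described above (load the geocoded stops, set time as the impedance, and solve).

```python
# Hedged sketch: solving an optimal route with arcpy Network Analyst.
# "Streets_ND", "TravelTime", and "trip_stops.shp" are placeholder names.
import arcpy

arcpy.CheckOutExtension("Network")           # Network Analyst license
arcpy.env.workspace = r"C:\gis\vacation"     # hypothetical workspace

# Route layer whose impedance is drive time
route_lyr = arcpy.na.MakeRouteLayer("Streets_ND", "VacationRoute",
                                    "TravelTime").getOutput(0)

# Load the geocoded stops (apartment, hotel, Higuera St, Hearst Castle)
arcpy.na.AddLocations(route_lyr, "Stops", "trip_stops.shp", "", "500 Meters")

# Solve and save the result for mapping
arcpy.na.Solve(route_lyr)
arcpy.management.SaveToLayerFile(route_lyr, "VacationRoute.lyr")
```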
Barriers were added to test optimal route generation that avoids traffic accidents. An accident was placed on the 405 N freeway near the Crenshaw Blvd. exit. This collision forced the driver to take the 110 N straight up, pass the 405 N merging ramp, and take the 105 W freeway; from the 105 W, the driver is then instructed to take the 405 N, which completely avoids the collision. Another collision was created on the 101 N freeway near the Thousand Oaks and Westlake Blvd exit. This one changed the route only slightly, making the driver exit, pass the accident on local streets, and return to the 101 N freeway. A service area analysis was also conducted to determine whether the locations of interest serve the area where I live. The break values for time in minutes were 10, 15, 20, 25, 30, 35, 40, 45, and 50, and the search tolerance was set to 500 meters. The service area was generated with default parameters, similar to the optimal route: U-turns were allowed everywhere and one-way restrictions were applied.
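A comparable sketch for the service area analysis is below, using the break values and search tolerance named above; again the network dataset, impedance attribute, and facility file are hypothetical placeholders.

```python
# Hedged sketch: service area with 10-50 minute breaks and a 500 m
# search tolerance. "Streets_ND", "TravelTime", and "home.shp" are
# placeholder names.
import arcpy

arcpy.CheckOutExtension("Network")
arcpy.env.workspace = r"C:\gis\vacation"

breaks = "10 15 20 25 30 35 40 45 50"        # minutes, as in the lab
sa_lyr = arcpy.na.MakeServiceAreaLayer("Streets_ND", "HomeServiceArea",
                                       "TravelTime", "TRAVEL_FROM",
                                       breaks).getOutput(0)

# The locations of interest act as facilities; 500 m search tolerance
arcpy.na.AddLocations(sa_lyr, "Facilities", "home.shp", "", "500 Meters")
arcpy.na.Solve(sa_lyr)
```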
A closest facility analysis for ATV rental stores was run from the Cottage Inn by the Sea hotel in Pismo Beach. A total of six facility points were created: Arnie's ATV Rentals, Steve's ATV Rentals, BJ's ATV Rentals, Angello's ATV Inc., Sun Buggy, and Grover Beach Motor Sports. The result showed that the closest store was BJ's ATV Rentals. This analysis was done after my weekend trip; I rented ATVs at Angello's ATV Inc. because of its affordable prices compared to the other rental shops. However, had I known that the more affordable ATVs are mostly for kids and that the closer rental shop was BJ's, I would have saved time and gas.
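A closest facility run can be scripted in the same way; this sketch assumes hypothetical file names for the hotel and the six ATV shops.

```python
# Hedged sketch: closest-facility analysis from the hotel to the ATV shops.
# Dataset and file names are placeholders.
import arcpy

arcpy.CheckOutExtension("Network")
arcpy.env.workspace = r"C:\gis\vacation"

# "" skips the cutoff; 1 = number of facilities to find
cf_lyr = arcpy.na.MakeClosestFacilityLayer("Streets_ND", "ClosestATV",
                                           "TravelTime", "TRAVEL_TO",
                                           "", 1).getOutput(0)

arcpy.na.AddLocations(cf_lyr, "Facilities", "atv_rentals.shp")  # 6 shops
arcpy.na.AddLocations(cf_lyr, "Incidents", "hotel.shp")         # the hotel
arcpy.na.Solve(cf_lyr)   # the solved route leads to the nearest shop
```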
Based on the results, the fastest route is understandable and realistic. The Network Analyst tool could be improved by the ability to read traffic conditions and analyze that information in addition to speed limits. This analysis was especially practical because my trip was long distance and away from Los Angeles traffic; within Los Angeles, the network analysis would be more reliable during hours of lighter traffic. Adding a traffic analysis function to network analysis would improve the accuracy of optimal routes.
Watershed Analysis
Watershed analysis is an essential tool in GIS that uses DEM and raster data operations to delineate watersheds and to define stream networks and basins. It is especially applicable in hydrology and water resources. The number of watersheds produced depends on the spatial scale: more watersheds are produced at a smaller scale. The threshold chosen for the stream network and the quality of the DEM strongly influence the outcome of the watershed analysis, and the algorithm used to derive flow direction also plays a role. The objective of this lab assignment was to derive two components (basins and a stream network) and to compare this watershed analysis with a product from a different source. The area of focus was the Tibetan Plateau, which contains the world's largest group of high-altitude endorheic lakes.
The methods for this lab require close attention to detail: the steps are easy, but a mistake can ruin the whole process. The Spatial Analyst extension must be checked before proceeding with the tools. The Hydrology toolset is located under Spatial Analyst in ArcToolbox. The first step is to create the fill (a filled DEM), but this was already provided for student convenience and was easily accessible from the class website. The second step is to create the flow direction raster with the filled DEM as the input. Then the basin is generated using flow direction as the input. After this step, the first component of the lab can be completed by converting the basin into a polygon using the Raster to Polygon tool. The stream network is a little more complicated than the basin; the output shows small individual stream sections. The Flow Accumulation tool under Hydrology is used, with the fill from the previous steps as the input weight raster and the flow direction raster as the other input; both are needed to create the flow accumulation raster. Float is kept as the default output type. The flow accumulation raster is then reclassified into a stream raster using the Reclassify tool: the number of classes is changed to 3, the method is changed to manual (in that order), and the break values are set to 0, 500 (the stream threshold), and 9999999999999999 (to the end of the range). The new values for the first two classes are set to "NoData". The next step is stream order, using the Stream Order tool under Hydrology: the reclassified raster is the input stream raster, the flow direction raster is the other input, and the stream ordering method is changed to Shreve. The stream raster can then be vectorized into stream lines using the Stream to Feature tool. The symbology should be changed for clarity: use graduated colors, set the value field to GRID_CODE, and use 5 classes; the color for each class can be adjusted. It is also worth moving the lake layer to the top of the table of contents. Now other sources can be downloaded and compared with the finished stream network and basins; putting the basin layer on top of the other source makes the differences easier to see. For this lab exercise, data for Asia was downloaded from the Global Drainage Basin Database.
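A minimal arcpy Spatial Analyst sketch of this workflow follows. Paths are placeholders, and for brevity it uses SetNull in place of the manual Reclassify step and leaves out the optional weight raster, so it is a simplified illustration rather than a literal transcript of the lab procedure.

```python
# Hedged sketch of the watershed workflow: flow direction, basins,
# flow accumulation, a 500-cell stream threshold, Shreve stream order,
# and vectorized streams. Paths are placeholders.
import arcpy
from arcpy.sa import (FlowDirection, FlowAccumulation, Basin,
                      SetNull, StreamOrder, StreamToFeature)

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\gis\watershed"

fill = arcpy.Raster("fill_dem")               # the provided filled DEM

flow_dir = FlowDirection(fill)                # step 2: flow direction
basin = Basin(flow_dir)                       # step 3: basins
arcpy.conversion.RasterToPolygon(basin, "basins.shp")

flow_acc = FlowAccumulation(flow_dir)         # accumulation raster
streams = SetNull(flow_acc <= 500, 1)         # keep cells above threshold
order = StreamOrder(streams, flow_dir, "SHREVE")
StreamToFeature(order, flow_dir, "streams.shp")
```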
The watershed map created has an extensive stream network and many water basins. Most of the streams have a stream order of 1 (green), indicating water flowing from a primary source, so more water can be collected downstream. The streams are clearly contributors to the formation of the lakes; the streams appear very detailed and the lakes are clearly evident. When the two maps are compared, the watershed map created here shows more detail in its stream network and even appears more accurate. The data downloaded from the Global Drainage Basin Database is misleading in places: there are empty areas where the basin should be partly or wholly covered by lakes, and this white area is not counted as basin. This is probably because the data covers all of Asia and does not show much detail for the study area. The lakes are shown as points, and there are only a few. On the other hand, the advantage of my map is the detail of its stream networks and lakes. The basins on the two maps do not match. These differences are likely due to differences in method: my map was area-based while the downloaded data was pour-point based. Perhaps there was also a z-limit difference during the watershed analysis; the z factor was a major issue when working with ArcGIS 10. Another important point is that there should not be only one basin if there are parallel lakes. These discrepancies can be seen when closely comparing the basins on the two maps. For example, the large basin at the bottom right corner of my map contains several lakes; this is a reasonable error to catch when comparing against other sources, and in this case the downloaded data has a more accurate map of basins. It also seems that, on my map, the largest basins are produced in areas with the largest stream order, whereas this is not the case for the downloaded data; its basins are separated while my map's basins form one large piece. Nevertheless, the stream network and the number of lakes my map shows are significant compared to the downloaded data, which shows only 3 lakes, too few to support a truly realistic watershed analysis.
The fill is an important step for the rest of the analysis to work properly. This step became an issue from the start and was resolved by the TA; which z value was supposed to be used was one of the major questions. There was no problem in ArcGIS 9, but ArcGIS 10 seemed not to tolerate any z-value errors. Further problems occurred with reclassification: no instructions were given about how to input the break values or the old and new values, and it took some trial and error to figure out which numbers fit best. Of course, there were also unexplained errors; usually these are solved by redoing the procedure. Downloading the right DEM is important in watershed analysis, and its resolution and quality are especially important. There were some complications with USGS data sources: the data did not seem to work properly when performing the fill or any of the tools after it, producing a blank output with stars (bright spots). This might have been another symptom of the fill problem.
Nevertheless, watershed analysis remains an important topic in GIS. It requires the integration of knowledge and data and is important in solving hydrological problems. The results of watershed analysis can be very helpful; for instance, proper, sustainable management of watersheds can prevent vegetation and fertile soil from being depleted. Watershed analysis is therefore crucial for environmental protection. Comparing personally created maps with other sources also helps explain differences and identify the best information available for watershed purposes. In this lab, both maps were needed for proper analysis: the downloaded data had more accurate basins, while the created map had more detailed drainage networks and lakes.
GPS Data Collection and Image Georeferencing
Georeferencing is an important tool that applies real-world coordinates to a map, air photo, or digital imagery. Raster data is commonly obtained by scanning maps or collecting air photos and satellite imagery. Scanned map datasets lack spatial reference information, and the locational information for air photos and satellite imagery is often inadequate, so the data does not align correctly with other data in GIS. Georeferencing aligns raster datasets with other spatial data in a map coordinate system. The raster data can then be viewed, queried, and analyzed together with other geographic data in GIS.
One method of obtaining real-world coordinates is measuring them from satellite signals with GPS technology. The GPS system is designed to provide information as accurate as possible; however, there are still errors that cause deviations from the actual GPS receiver position. Aside from practicing georeferencing, the purpose of this exercise was to evaluate these uncertainties in order to minimize their effects and improve accuracy.
Signal clarity and reception are not entirely reliable when using GPS. For example, using GPS indoors can produce weak signals, and the signal can reflect off objects such as buildings and rocks. The GPS device can fail to give proper coordinates in a low-reception area. The built-in clock of the GPS receiver can also cause errors, since it is not as accurate as the atomic clocks on the satellites; this timing error produces errors in the position calculations. GPS users can also introduce errors by mis-recording the exact location where the coordinates were taken. Identifying a series of ground control points (GCPs) links locations on a raster with locations in existing spatially referenced data. These control points can be accurately identified in both the raster dataset and real-world coordinates, and they ultimately minimize errors associated with human mistakes. There will also be errors when matching the locations of the GCPs with GPS coordinates, because it is unlikely that the exact location can be pinpointed on the map. GCP distribution is important to the accuracy of the geometric correction process, and it depends on the number of selected GCPs. The GCPs should be evenly distributed, with enough points surrounding each section. For instance, section 1 of the UCLA campus can have 2 GCPs at the top and none at the bottom; section 2, which is directly underneath section 1, can have 2 GCPs at the top to make up for the missing GCPs at the bottom of section 1. A poor location and distribution of GCPs increases the average RMS error of the correction.
This lab assignment involved groups of four or five going out into the field and collecting points of interest throughout campus. Errors could have been introduced by some groups not recording the correct coordinate information. Out of all the points collected, 16 were chosen for georeferencing. A separate Excel sheet was created with only these 16 points, added as XY data in ArcMap, and defined as UTM Zone 11. The points were exported to a shapefile and displayed together with the UCLA JPEG image. After completing the georeferencing process for all points, the residuals and total RMS error were saved. Lastly, the image was rectified as a TIFF, and the points were shown on the map as symbols representing the residuals. The street layer was also added and clipped with the rectified image.
The residuals show the difference between the actual point specified and where the point ended up after transformation. A root mean square (RMS) error was also generated; it is the total error computed by taking the root mean square of all the residuals, and it indicates the consistency of the transformation across the different control points. Point 6 was removed to reduce the RMS error. Keep in mind, however, that a poorly entered control point can cause drastic errors even when the RMS error appears low. The RMS error recorded, as shown on the map, was 4.77714. This value is high, presumably because groups made recording mistakes and the GPS also had inaccuracies. If there had been more time to collect GCPs, the RMS error could have been lower by allowing each group to work with its own GCPs across the campus. Georeferencing is a valuable tool nonetheless. GPS technology should be continually upgraded and improved to achieve greater accuracy, provided there are ways to reduce errors.
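To make the RMS definition concrete, the short Python sketch below computes a total RMS error from a list of residuals. The residual values are made up for illustration and are not the values from this lab.

```python
# Hedged sketch: total RMS error from control-point residuals.
import math

residuals = [3.2, 5.1, 2.8, 6.0, 4.4]   # hypothetical per-GCP residuals

rms = math.sqrt(sum(r ** 2 for r in residuals) / len(residuals))
print(round(rms, 5))                     # root mean square of all residuals
```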
Suitability Analysis
UCLA has proven to be a top university, continuing to attract applicants from all around the world; in fact, it has been known as the most applied-to university in the nation. UCLA operates on a global level, with diversity, surrounded by one of the most recognized cities in the world, Los Angeles. UCLA has gained a reputation for excellence and prosperity. Most importantly, UCLA makes a difference in the world, and students, along with staff members, are a huge part of it. The performance of UCLA in academics, research, and service is dedicated to creativity, perseverance, and great accomplishments and discoveries. If building an extra UCLA campus and naming it UCLA2 sounds exciting, it is even more interesting to learn about what is being done here, now, in the UCLA Department of Geography with Geographic Information Systems (GIS) and related technology. A suitability analysis was conducted to identify the best possible location for a new UCLA2 campus. Several parameters were considered including, but not limited to, slope, urban areas, burglary incidents, and distance from major highways. The new campus is supposed to serve approximately 5,000 additional students and staff. The UC Regents and LA County have put out a Request for Proposals to locate the best place for building a new campus; the only criterion given for the project was that the campus must be built in Los Angeles County. The capabilities of GIS are powerful and technical enough to effectively support the decision to establish a satellite campus, UCLA2.
The procedure for this exercise is detailed, but at the same time repetitive and simple with sufficient practice and understanding of the material. The analysis used the UTM Zone 11 projection, except for the crime data, which was projected to State Plane 5 (California V). The needed data was downloaded from UCLA GIS Mapshare and the Los Angeles County GIS Data Portal, and a mosaicked DEM from a past lab assignment was needed to generate a slope map. A total of four parameters were converted into rasters (slope is already in raster form) and reclassified according to suitability. At the end, a final suitability map was calculated, and boxes were drawn to indicate the best possible locations for UCLA2.
Slope is given the most weight because construction of a new campus on a steep slope is not only dangerous but impractical. The mosaicked DEM from a previous lab assignment was used to generate a slope map. Reclassification was necessary so that the steepest slope received the lowest number and the most gradual slope the highest number; a higher number indicates higher suitability for construction. A total of 5 classes with equal intervals were used for the reclassification. After reclassification, the slope map was clipped with the Extract by Mask tool using the LA County layer. This reclassified raster is then ready for the final suitability calculation. An important note is that the slope map also reflects elevation, which can affect the dangers associated with earthquake damage; higher elevations were assumed to suffer more earthquake damage, so construction at lower elevations was considered safer and more reliable.
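The slope step can be sketched in arcpy as below. The class breaks are illustrative only (the lab used five equal intervals computed by the tool), and the paths and the county shapefile name are placeholders.

```python
# Hedged sketch: slope, a 5-class suitability reclassification, and a
# county mask. Breaks and paths are placeholders.
import arcpy
from arcpy.sa import Slope, Reclassify, RemapRange, ExtractByMask

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\gis\ucla2"

slope = Slope("mosaic_dem", "DEGREE")
# Steepest slope -> 1 (least suitable), flattest -> 5 (most suitable)
remap = RemapRange([[0, 9, 5], [9, 18, 4], [18, 27, 3],
                    [27, 36, 2], [36, 90, 1]])
slope_suit = Reclassify(slope, "VALUE", remap)
ExtractByMask(slope_suit, "LA_County.shp").save("slope_suit")
```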
Highways are given the second highest weight because transportation to and from campus is important for both students and staff. UCLA itself is built right by the 405 freeway, which is very convenient, especially during heavy traffic hours; highways are therefore essential for UCLA2. The data for highways, the major highways of Los Angeles County, was obtained from UCLA GIS Mapshare. The shapefile was buffered with the Multiple Ring Buffer tool using distances of 1 and 3 miles, and the buffers were then converted into a raster and reclassified. Realistically, 1 mile from the highway is too close for any construction, so the 3-mile buffer was given the highest suitability, and the area beyond the 3-mile buffer was given the second highest suitability score. It is important to remember that the "No Data" value in the reclassification must be given a number; this number is the second highest suitability and is shown in blue on the map, while green indicates the most suitable areas. This reclassified raster is then ready for the final calculation.
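A sketch of this buffer-and-reclassify step is below; the burglary and urban-area layers follow the same pattern with their own distances and scores. File names, cell size, and scores are placeholders, and Con/IsNull is used here to give NoData a value, equivalent to assigning NoData a score in the Reclassify dialog.

```python
# Hedged sketch: ring buffers around major highways, conversion to raster,
# reclassification, and an explicit score for the area beyond the buffers.
import arcpy
from arcpy.sa import Reclassify, RemapRange, Con, IsNull, ExtractByMask

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\gis\ucla2"

# Rings of 1 and 3 miles around the major highways
arcpy.analysis.MultipleRingBuffer("major_highways.shp", "hwy_buffers.shp",
                                  [1, 3], "Miles", "distance", "ALL")
arcpy.conversion.FeatureToRaster("hwy_buffers.shp", "distance",
                                 "hwy_buf_ras", 30)

# Within 1 mile: too close (2); 1-3 miles: most suitable (5)
hwy_suit = Reclassify("hwy_buf_ras", "VALUE",
                      RemapRange([[0, 1, 2], [1, 3, 5]]))
# Beyond the buffers is NoData; give it the second-highest score (4)
hwy_suit = Con(IsNull(hwy_suit), 4, hwy_suit)
ExtractByMask(hwy_suit, "LA_County.shp").save("hwy_suit")
```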
Burglary was given the next highest weight because burglary is the most commonly committed crime around university campuses. Laptops and even cars are frequent targets, and libraries constantly post warnings about theft and about not leaving valuables unattended. A stolen laptop or folder can cause an extreme loss of academic work and personal possessions. Burglary is therefore unacceptable under any circumstances, and the campus should be located away from burglary hotspots so that students and staff have peace and quiet. Crime data was downloaded from the Los Angeles County GIS Data Portal. Part 1 crimes, as classified by the FBI, committed within the last 30 days were downloaded. The file was saved as an Excel file and filtered for burglary only, and the filtered results were then saved again as an Excel file (older format). The X and Y columns were separated using Text to Columns on the Data tab with the comma delimiter checked, and the X and Y values were then added to ArcMap as XY data. This data had to be projected to State Plane 5 to match the Los Angeles County layer; zooming into the LA County layer makes the points visible. The points were then given multiple dissolved buffers of 1, 2, and 3 miles. The procedure after this was similar to the major highways, except that the No Data value was given the highest suitability, with suitability scores descending as the buffers approach the points. The buffer closest to a crime point is the least suitable and is shown in purple on the map, while the most suitable locations are shown in red; a distance of at least 3 miles from theft incidents should be kept. As before, the buffers were converted to raster before reclassification and are then ready for the final suitability analysis.
Urban areas were given the least weight of all the factors because it is not absolutely necessary to build UCLA2 in an urban area. Data for urban areas was obtained from UCLA GIS Mapshare and was in polygon format. A single 10-mile dissolved buffer was created with the Buffer tool; it is shown in pink on the map, and the remaining areas, which are not near an urban area, are shown as the worst locations. Urban areas are emphasized as the best locations. The buffer was converted into a raster and reclassified, and again the No Data value (the non-urban areas) was given a lower number. All the reclassified and masked rasters are now ready for the final suitability analysis.
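With the four reclassified rasters prepared, the final overlay is a simple map-algebra sum, as sketched below. The layer names are placeholders and the weights are illustrative only: the text ranks the factors but does not state exact weights, and a plain unweighted sum corresponds to the Raster Calculator step described next.

```python
# Hedged sketch of the final suitability overlay in map algebra.
import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\gis\ucla2"

slope_suit = Raster("slope_suit")
hwy_suit   = Raster("hwy_suit")
crime_suit = Raster("crime_suit")
urban_suit = Raster("urban_suit")

# Hypothetical weights reflecting the ranking: slope > highways > crime > urban
suitability = (4 * slope_suit + 3 * hwy_suit +
               2 * crime_suit + 1 * urban_suit)
suitability.save("ucla2_suitability")
```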
The results are calculated with the Raster Calculator by adding all the reclassified and masked raster layers, producing a final suitability layer. The symbology color ramp was changed and given the maximum number of classes: blue indicates the least suitable areas, while red indicates the highest suitability. Reasonable locations for the new satellite campus UCLA2 are marked by black rectangular boxes. One of the most suitable locations is just north of UCLA, below the 118 freeway, above the 101 freeway, and east of the 405 freeway; this is a huge area of consideration and could be used to its full potential given the nearby freeways. Another suitable location is below the intersection of the 138 and 14 freeways, in Lancaster. The last suitable place for UCLA2 is also near major freeways, located just above the Angeles National Forest.
Issues associated with this lab assignment were ongoing and complicated, and they were especially difficult to solve when working with the crime data. The crime point data was big enough to freeze the computer and crash ArcMap, and the buffers took longer than expected, so smaller buffers were created as an alternative. Reclassifying also took a lot of trial and error. The most important thing to remember is to give the No Data value a score when assigning suitability outside the buffers. If a value is not given before the raster calculation, the areas outside the buffers are completely ignored since they do not interact with the other reclassified rasters; every space within the Los Angeles County boundary must be given a value. Realistically, there are hundreds of parameters that might influence the construction of UCLA2, and this is just a proposal for its best possible location; further research should be done on other parameters that might affect the most suitable location. Overall, this lab assignment ended successfully after strenuous effort and redoing steps over and over again. In the end, the final suitability map was worth the hardships, because UCLA2, if built, will not only be safe from burglary but will also be conveniently located near highways and urban areas. In addition, this UCLA2 campus will be safer from earthquake damage and will have more than enough space for 5,000 additional students and staff, even with gyms, stores, and sporting fields built in.
Viewshed Analysis of Cell Towers
Cellular signal strength anywhere in Los Angeles County matters in daily life, as modern society depends on constant connection. The operation of cell phones depends on cell tower location, height, and power. Conducting viewshed analysis on cell tower coverage for Los Angeles County helps determine which of these factors best enhances cell tower performance and serves the largest population with signal. The original viewshed is shown along with re-run viewsheds for each option, and the coverage percentages of the original and new viewsheds are compared.
The DEM for Los Angeles County and surrounding areas was downloaded from the USGS Seamless website. Two separate DEMs had to be joined with the Mosaic tool, and the mosaicked DEM was then projected to WGS 1984 UTM Zone 11. The cell tower layer's projection was kept the same; instead, new attribute fields were added for the viewshed observer parameters: OFFSETA, OFFSETB, AZIMUTH1, AZIMUTH2, and RADIUS2, which the Viewshed tool reads as tower height, viewing angles, and range.
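A minimal sketch of running the viewshed itself is below; the Viewshed tool reads the observer fields (for example OFFSETA for tower height and RADIUS2 for range) from the tower layer's attribute table. The paths and file names are placeholders.

```python
# Hedged sketch: viewshed from the cell towers over the mosaicked DEM,
# masked to Los Angeles County. Paths are placeholders.
import arcpy
from arcpy.sa import Viewshed, ExtractByMask

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\gis\towers"

vshed = Viewshed("mosaic_dem", "cell_towers.shp")
ExtractByMask(vshed, "LA_County.shp").save("vshed_la")
```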
Given $30,000 to determine how cell tower performance could be improved, three options were tested. First, 3 new towers were added with the Editor tool at locations with the lowest cell reception. Second, the heights of all cell towers were raised by 10 meters. Third, the range (power) of each tower was increased by 5,000 meters. Before conducting the viewshed analysis for the second option, the newly added points were removed and the OFFSETA values were changed. Similarly, before the third option, the OFFSETA values were changed back to their original values and RADIUS2 was changed. Lastly, Extract by Mask was run for each result. Calculating the percentage of people in Los Angeles County with reception shows that cell tower power is important. Increasing the radius by 5,000 meters resulted in 62% of the population receiving signal; this is only a 3% improvement over the original viewshed analysis, but it proves to be the best solution for increased reception. Adding 3 extra cell towers at optimal locations yielded 59.9%, and raising the height of each tower resulted in 61%. The calculation takes the sum of the population receiving reception from the attribute table of the Extract by Mask layer, divides this value by the total population of Los Angeles County, and multiplies the result by 100 to obtain a percentage. Increasing power is therefore recommended, perhaps by even more than 5,000 meters.
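The percentage calculation itself is simple arithmetic; the sketch below uses made-up population numbers purely to illustrate the formula described above.

```python
# Hedged sketch of the coverage-percentage formula; the values are
# hypothetical, not the lab's actual numbers.
pop_with_signal = 6100000      # sum of population from the masked layer
total_population = 9800000     # total Los Angeles County population

coverage_pct = pop_with_signal / total_population * 100.0
print(round(coverage_pct, 1))  # percent of population receiving signal
```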
DEMs and Terrain Modeling
The purpose of this lab assignment is to develop a better understanding of terrain analysis applied to vegetation types in relation to topography. The specific area of interest is the Santa Monica Mountains (SMM). After calculating slope, aspect, and solar radiation from the DEM obtained from USGS, average annual insolation and mean values are computed, then compared with vegetation types and graphed.
First, the DEM for part of the SMM must be downloaded through the USGS Seamless website. A 30-meter resolution DEM is downloaded and opened in ArcMap. Before moving on, the vegetation data provided by the instructor must be clipped to the DEM; this clipped vegetation layer identifies the vegetation types within the chosen DEM. It is also important to project the vegetation layer and DEM to UTM Zone 11N. Calculating slope and aspect is then easy: make sure Spatial Analyst is checked under Extensions, navigate to Spatial Analyst Tools in ArcToolbox, and open Slope or Aspect under Surface. The input raster should be the projected UTM Zone 11N DEM; save the output raster to the appropriate drive and click OK. The aspect and slope maps should be generated in less than a minute.
The next step is to calculate solar angles for use with the Hillshade tool, which is also under Surface. Unlike previous introductory labs, this hillshade calculation requires azimuth and altitude values for each season. Look up the equinox and solstice dates and compute the solar angles on the Sustainable by Design website: use no daylight saving, an elevation of 0, and a time of 12:00 PM. Zero azimuth is south; input the date corresponding to each equinox or solstice. The longitude and latitude are found in the DEM properties in ArcMap. If the azimuth is a negative value, just add 360 to it. Enter the azimuth and altitude values in the Hillshade tool with the projected DEM as the input raster, leave the Z factor at its default, and run the tool. Repeat these steps for each season; there should be a total of four hillshade layers.
The solar radiation algorithm is provided in the last slide of the lecture slides. The equation is I = S * Hillshade / 255, where I stands for insolation and S equals 1000. Plug in the hillshade calculated above and compute the insolation for each season using the Raster Calculator. These four insolation maps are submitted as part of the final layout, and they correspond to the calculations as shown on the layout. Insolation maps provide essential information on the amount of solar radiation each season receives; the season with the strongest solar radiation exposure is summer. Darker areas indicate less insolation and lighter areas receive more, with the lightest areas receiving the most solar radiation. The legend also indicates the range of solar radiation values from low (dark) to high (pure white).
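A short sketch of one season's calculation is below. The azimuth and altitude shown are placeholder values, not the angles used in the lab; the formula I = S * Hillshade / 255 with S = 1000 is applied with map algebra.

```python
# Hedged sketch: one season's hillshade and insolation raster.
# The solar angles here are placeholders.
import arcpy
from arcpy.sa import Hillshade

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\gis\smm"

S = 1000.0
hs = Hillshade("dem_utm11", 180, 60)   # azimuth, altitude (placeholders)
insolation = S * hs / 255.0
insolation.save("insol_summer")
```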
To give a clearer picture of how vegetation types relate to slope, aspect, and solar radiation, tables or graphs can be generated. For this lab, graphs of solar radiation regime vs. season, slope mean vs. vegetation type, aspect mean vs. vegetation type, and elevation mean vs. vegetation type were created. It is recommended to pay very close attention to the following steps for calculating the means. The mean values are calculated using the Zonal Statistics as Table tool, located under Zonal in Spatial Analyst Tools. For the slope and aspect means, the input zone data should be the clipped vegetation layer with the zone field set to WHRNAME (vegetation names), and the input value raster should be either slope or aspect, depending on what is being calculated. Save to the appropriate drive and change the statistics type to MEAN. Repeat these steps with the spring, summer, fall, and winter insolation rasters as the input value rasters. Once a mean table is calculated, open the table for each season and find the sum of the means under Statistics. The mean sum for each season is entered into Excel and saved, and the Excel spreadsheet is then opened by adding it as data; this table is used to create a graph showing the mean solar radiation for each season. Lastly, the mean should also be calculated for elevation, using the projected UTM Zone 11N DEM to find the mean elevation. The graphs are created simply by opening the tables and clicking Create Graph under Table Options in the top left corner. One issue with exporting the finished map layout as a PDF is that the y-axis titles appear rotated horizontally; this is fixed by deleting the title in the Create Graph tool and manually typing the text with the Draw tool.
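The zonal mean step can be sketched as below. The clipped vegetation layer and output names are placeholders; WHRNAME is the zone field named above, and the same call is repeated for aspect, each season's insolation, and the DEM.

```python
# Hedged sketch: mean slope per vegetation type with Zonal Statistics
# as Table. File names are placeholders.
import arcpy
from arcpy.sa import ZonalStatisticsAsTable

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\gis\smm"

ZonalStatisticsAsTable("veg_clip.shp", "WHRNAME", "slope",
                       "slope_mean.dbf", "DATA", "MEAN")
```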
It is clear that chamise-redshank chaparral has the highest mean elevation and slope values compared to the other vegetation types, with coastal scrub having the second highest. These mean values can be referred back to the slope map. The higher the elevation, the more solar radiation an area may be receiving; however, this is not always true for slope. When determining how much solar radiation an area receives according to the slope map, the aspect has to be considered. The low areas closely surrounding the high elevations receive the least incoming solar radiation because of shadows. Summer receives the most solar radiation of all the seasons, as expected, and this result is evident in the graph as well; the lighter the gray shade in the hillshade maps, the higher the solar radiation. The aspect map shows the compass direction of the slope and indicates that north-facing slopes are more likely to be exposed to solar radiation. The aspect mean graph indicates that chamise-redshank chaparral has the lowest mean aspect value, so this vegetation type includes the most north-facing vegetation. There are no flat surfaces in this DEM, so in the case of an emergency plane landing there is no particularly safe place to land. It was interesting to see that the mean solar radiation for spring and autumn were almost identical, with spring just slightly higher; winter, as expected, had the lowest solar radiation exposure. All of these results refer to a single day in each season, and the spatial variability of insolation changes with the day and time of year. Nevertheless, studying solar radiation strongly helps in determining its effects on many biological and physical processes.
Importance of Wireless Cell Tower Locations
Revolutions in mobile phone technology continue to irrevocably change how modern society interacts and functions. People depend heavily on cellular network connections on a daily basis; without cell phones, there would be a sense of disconnection from the world. This growing phenomenon is made possible partly by cell phone towers that allow cell phones to get reception. Reliable reception is crucial in extreme circumstances such as emergencies. For instance, a lost child might be located only through the GPS technology in a cellular device, and an elderly person living on the top floor of a tall building in Los Angeles needs strong reception to quickly push the emergency button on her phone during a heart attack. Therefore, awareness and knowledge of cell phone tower locations is important in verifying strong cellular connection strength.
When considering the purpose of cell phone towers, it makes sense to assume that their locations are determined by certain factors, including population density and areas of interest that need signal strength. The map I created clearly presents four counties (Los Angeles, Orange, Ventura, and San Diego) and their respective cell tower locations. It is important to note that the FCC does not require every single tower to be registered on its site, so the map may not be a complete inventory of all towers in the study area. However, the focus of this map is to answer the critical question of why cell phone tower locations matter. The map shows that cell phone towers are located mostly near major highways. Also, wealthier people tend to live near the beaches, so more cell phone towers are located toward the coast. As a fun fact, Catalina Island has only one cell phone tower (near Avalon). It is clear that cell phone towers are built at specific locations for a reason. There are fewer cell phone towers in major parks, since these places are usually not heavily populated; the towers that are located within parks suggest that those spots are common places for play and recreation. Cell phone tower locations are thus useful in identifying populated areas. In addition, locating cell towers becomes increasingly important when considering people's complaints about health hazards and property value depreciation. Towers are tall structures that can block the views from certain properties, which hurts some businesses. Cell towers are also a constant topic of health concern, with claims that they cause illnesses such as cancer; the possible danger cited is exposure to radiation. Furthermore, some people assume the radiation is not harmful because the towers are far away, but this is not the case in some areas, as is evident on the map. Some towers are even camouflaged by disguising them as trees.
Regardless of these troubling issues around cell phone tower locations, cell phones have become too important to eliminate completely. Further research could address whether airports influence the scarcity of nearby cell phone towers. One might assume there would be many towers near airports, given the large numbers of people arriving and departing who need to call one another; the map suggests otherwise, showing no cell towers very close to the airports. Nevertheless, cell phone tower locations can provide insight into the spatial distribution of populations and predictions about new and emerging urban areas that will most likely be populated by people with cell phones.