The World Bank recently shared an exemplary, practical application of big data for improving Seoul's transportation operations. Without complex machine-learning algorithms, Korean planners intelligently leveraged cellular data and the associated user information to predict demand along routes. As the report puts it, "the departing and arriving points of travelers may be the most essential data in structuring bus routes." It is not hard to infer the major implications this has for designing transportation networks in big cities. The full article can be found here: https://twitter.com/WBPubs/status/593268540125421568
I spent a good chunk of 2014 thinking about (and being frustrated by) change. My musings centered on the question: why do people resist even a lucrative change initiative (e.g., "you will work less and still produce more")?
In my previous post (Ingredients of Change) I attempted to make an exhaustive list of the factors that encompass a change initiative. However, the crux of change, or its most difficult part, is convincing people in their hearts and motivating them to commit to your initiative. As I recently discovered, that process is all about changing perception. As Luc de Brabandere calls it, it is to change twice. Forgetting to change the perception will result in failure.
Change of Perception is a discontinuous process
The word "perception" derives from the Latin roots per (entirely) + capere (take). I think the etymology of the word beautifully implies a precise meaning: you either get it or you don't; it is in or out, black or white, with no gray. Now Brabandere's definition makes perfect sense: perception is not a continuous process but a discontinuous one. It happens all of a sudden, or it doesn't happen at all and all your work fails.
Now I feel one step closer to understanding change management. Look at some failed M&A cases. In 2005 eBay bought Skype. As rumor has it, the two companies never aligned their perceptions of how each company works, and the acquisition did not work out. But in 2011 Microsoft bought Skype and has been operating it happily ever since.
Accomplishing a successful change requires three things: knowledge, will (volition), and power.
Every single step in the process of change falls under one of these three categories.
- Knowledge
  - The actual plan of the change
  - Benefits of the proposed change
  - Defined expectations (success metrics)
- Will (volition)
  - Motivating process-owners
  - Making process-owners see the value
- Power
  - Executive initiative
  - Process-owner capabilities for the proposed state
The sub-bullets could be extended, but the three categories are exclusive and exhaustive enough to remain the same for all types of projects.
From Pole to Pole is the first episode of the documentary series Planet Earth, aired in 2006.
I believe observing a natural process (such as the supply chain in ecology) can be very moving and can teach more valid lessons than academics do. I mean no disrespect; I simply think the lessons academics teach are incomplete (though they are our best option for knowledge), whereas our planet is complete and represents a perfect system. My view on this is likely to remain stable until someone can propose and design an alternative solar system. You will really feel this if you watch the documentary.
It costs around $3 on Amazon. I highly suggest taking some time to watch this documentary for fun, and also to draw some lessons from it.
Link for purchasing it from Amazon: http://www.amazon.com/gp/product/B00321SMG8/ref=atv_feed_catalog?tag=imdb-amazonvideo-20
As mentioned in an earlier TSP post (TSP in R), R's simple data-manipulation abilities are just beautiful. You can input GPS coordinates and accurately calculate the distance matrix based on great-circle distance with a single function, spDists().
```r
library(sp)
library(tspmeta)

city <- subset(USCA312_coords, lat < 44 & lat > 25)
coords.df <- data.frame(long = city$long, lat = city$lat)
coords.mx <- as.matrix(coords.df)

# Compute great-circle distance matrix
dist.mx <- spDists(coords.mx, longlat = TRUE)

# Construct a TSP object
tsp.ins <- tsp_instance(coords.mx, dist.mx)

# Solve with the 2-opt heuristic
tour <- run_solver(tsp.ins, method = "2-opt")

# Plot
autoplot(tsp.ins, tour)
```
Interestingly, finding the exact solution to this ordering problem, known in common terminology as the traveling salesman problem (TSP), is one of the hardest problems in combinatorial optimization; it is NP-hard. Touring n cities (vertices) from a fixed start requires checking (n-1)! possibilities. In my endeavor, 3,000 locations had on the order of 10^9127 possible tours. That number is far larger than the number of grains of sand on any beach I know of!
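To get a feel for that growth, the count of tours can be evaluated in base R. The raw factorial overflows double precision, so this small sketch works in log space with `lfactorial()` (the choice of n = 3000 matches the instance above):

```r
# Number of distinct tours visiting n cities from a fixed start: (n-1)!
# factorial(2999) overflows a double, so compute log10((n-1)!) instead
# via lfactorial(), which returns the natural log of the factorial.
n <- 3000
log10.tours <- lfactorial(n - 1) / log(10)
log10.tours  # about 9127, i.e. (n-1)! is on the order of 10^9127
```

The same trick works for any n too large for `factorial()` directly.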
The tspmeta package does not provide a built-in exact solver, but run_solver() exposes the following seven heuristics:
- Nearest neighbor
- Repetitive nearest neighbor
- Insertion algorithms (nearest, farthest, cheapest, and arbitrary)
- 2-opt edge exchange
#1 TSP225.tsp: 225 vertices in 2D
The following code reads your TSP225.csv file and outputs your solution and its visualization. The 'tour' object generated is of class TOUR, an integer vector containing your solution.
```r
library(tspmeta)

TSP225 <- read.csv("TSP225.csv")
coords.df <- data.frame(long = TSP225$Long, lat = TSP225$Lat)
coords.mx <- as.matrix(coords.df)

# Compute Euclidean distance matrix
dist.mx <- dist(coords.mx)

# Construct a TSP object
tsp.ins <- tsp_instance(coords.mx, dist.mx)

# Solve with the 2-opt heuristic
tour <- run_solver(tsp.ins, method = "2-opt")

# Plot
autoplot(tsp.ins, tour)
```
Comparing solutions: the plot below shows the tour lengths from the seven heuristics against the exact solution from Concorde. For the Concorde solution, I used the NEOS Server hosted at UW-Madison.
```r
methods <- c("nearest_insertion", "farthest_insertion", "cheapest_insertion",
             "arbitrary_insertion", "nn", "repetitive_nn", "2-opt")
tours <- sapply(methods, FUN = function(m) run_solver(tsp.ins, method = m),
                simplify = FALSE)

dotchart(c(sapply(tours, FUN = attr, "tour_length"), "Concorde (NEOS)" = 3916),
         xlab = "Tour length (n=225)",
         main = "Heuristic Solutions vs. Exact (Concorde)")
```
#2 3000 random vertices in 2D
Apparently, as the number of vertices grows, the gap between the exact solution and the heuristic solutions grows significantly. The 2-opt solution is closest to optimal. Repeating the 2-opt solver and keeping the smallest tour length got me really close to the exact solution (not shown below).
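The restart idea can be sketched in base R. This is only an illustration of why restarting helps, not the tspmeta workflow from the post: a minimal 2-opt implementation run from several random starting tours, keeping the shortest result. The 30 random points and the restart count of 10 are made up for the demo.

```r
# Closed-tour length: sum of edge lengths, including the return edge
tour.length <- function(tour, d) {
  n <- length(tour)
  sum(d[cbind(tour, tour[c(2:n, 1)])])
}

# 2-opt local search with city 1 fixed in the first position:
# repeatedly reverse a segment tour[i:j] whenever doing so shortens the tour
two.opt <- function(tour, d) {
  n <- length(tour)
  repeat {
    improved <- FALSE
    for (i in 2:(n - 1)) {
      for (j in (i + 1):n) {
        nxt <- tour[if (j == n) 1 else j + 1]
        # Change in length from reversing tour[i:j]
        delta <- d[tour[i - 1], tour[j]] + d[tour[i], nxt] -
                 d[tour[i - 1], tour[i]] - d[tour[j], nxt]
        if (delta < -1e-9) {
          tour[i:j] <- rev(tour[i:j])
          improved <- TRUE
        }
      }
    }
    if (!improved) break
  }
  tour
}

set.seed(42)
pts <- matrix(runif(60), ncol = 2)   # 30 random points in 2D
d <- as.matrix(dist(pts))
n <- nrow(pts)

# Restart 2-opt from 10 random tours; keep the best local optimum
best.len <- Inf
for (r in 1:10) {
  start <- c(1L, sample(2:n))
  cand <- two.opt(start, d)
  len <- tour.length(cand, d)
  if (len < best.len) {
    best.len <- len
    best.tour <- cand
  }
}
best.len
```

Each restart lands in a different local optimum, so taking the minimum over restarts tightens the result at a linear cost in the number of runs.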
Some useful stuff:
Although I love the science of managing operations, the term "operations management" has never sounded charming to my ear, and I have wondered why the field never got a scientific name like epidemiology or pedagogy.
The word "operations" derives from the Latin root operatio (labor). Today, with the availability of quantitative methods and big data sets, the field is truly becoming the science of understanding and guiding decisions on how to expend labor optimally.
A Google Ngram search for "operations research" would probably best represent the history of the field.