The Up and Coming
“promising continued or future success; enterprising”
It’s not always easy to keep track of the many different practice areas around Computech; with expertise this broad, staying abreast of everything we’re accomplishing can be a challenge. So we’re here to keep you up to speed on what’s going on at Computech right now and in the near future.
Right now our Auctions team is providing hands-on training at the Middle East Spectrum Conference in Dubai. On Tuesday, March 30, the conference hosts the “Greater Flexibility in Spectrum Management” program, where Computech auction expert Kris Shields will speak on “Best Practice in Spectrum Auctions.” Fresh off the successful Swedish 800 MHz spectrum auction, and once the Dubai conference wraps up, the Auctions team will begin preparing for the 6th Annual European Spectrum Management Conference in June, for which Computech is a gold sponsor.
In non-auction news, Computech’s Chris Merkel will be speaking at the sold-out Midwest UX Conference in Columbus, Ohio. The conference, which takes place on April 9 and 10, features Chris’s talk on practical accessibility, along with a workshop that aims to take participants beyond the “industry standard” checklists, with in-person, hands-on demonstrations of a screen reader and a braille printer, details of current and upcoming federal accessibility regulations, and overviews of recent court battles.
In other news, last week Computech learned that we have won the recompete contract at the FCC. Under the new contract, Computech will expand our support of mission-critical systems by managing the Tracking and Administrative towers of the FCC’s portfolio in addition to the Licensing tower we manage today. As a company, we are extremely proud that the FCC has placed its trust in us once again.
Finally, as a quick reminder, Computech is currently seeking to fill positions including Project Assistant, Project Coordinator, Technical Writer, and Java Developer. Any inquiries or referrals can be directed to Carrie Gaut.
Posted on Tuesday, March 29th, 2011 at 10:24 am and is filed under FCC, Jobs, Online Auctions.
By: Darren Vandergriff
Juan Marin talks Broadband Map Development
Interview with Juan Marín: Technical aspects of the National Broadband Map
By Alberto Santos
March 6, 2011
DME: We’ve read that the software used is entirely open source. Why the choice of open-source tools? Do you think the final result would have been better if you had used commercial software?
DME: As for the individual components, which database management system did you use? And which tools to build the map interface?
J.M.: The main database is PostgreSQL with the PostGIS spatial extension for geographic data. The presentation layer is primarily OpenLayers, and we use GeoServer as the GIS server and GeoWebCache for map caching. The entire website is built on WordPress.
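To make the stack concrete, here is a minimal sketch of the kind of point-in-polygon lookup a PostGIS-backed service can answer. The table and column names are invented for illustration, not the project’s actual schema.

```python
# Minimal sketch: ask PostGIS which providers cover a given point.
# Hypothetical schema: provider_coverage(provider_name,
# max_advertised_speed, geom) is invented for this example.
import psycopg2  # PostgreSQL driver

conn = psycopg2.connect("dbname=broadband user=gis")
cur = conn.cursor()
cur.execute(
    """
    SELECT provider_name, max_advertised_speed
    FROM provider_coverage
    WHERE ST_Contains(geom, ST_SetSRID(ST_MakePoint(%s, %s), 4326))
    """,
    (-77.0365, 38.8977),  # example coordinates: Washington, DC
)
for name, speed in cur.fetchall():
    print(name, speed)
cur.close()
conn.close()
```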
DME: You mentioned that in the first hours, owing to its enormous success, the application went down from “excess traffic.” How did you rework the cloud-services architecture to restore service?
J.M.: We knew this was an important project and that it would generate interest, but nobody was prepared to handle peaks of up to 9,000 requests per second. When you serve your own map caches, the servers come under a lot of stress, especially when you use them the way we did: to describe the content in question and not just as a base map. The solution came through a horizontal scaling strategy, basically adding more servers to handle the load. We saw several waves of requests as the story appeared in the media (CNN, Fox News, The New York Times, etc.), which made the intervention complicated and longer than we would have liked.
DME: Over so many months of work, what have been the biggest challenges a project like this has had to face?
J.M.: The most important challenge was always meeting the February 17 deadline, which was a congressional mandate and therefore immovable. But it was not the hardest one; the most complicated was data integration. As often happens in GIS projects, integrating data from disparate providers, as was the case here, causes many problems even once a common data model has been defined. We have many ideas, some of them novel, on how to improve this process in the next round. Keep in mind that the Map serves information for the entire United States; we have geographic layers with 25 million polygons describing broadband access at a fairly high level of detail across a very large country. That information will double every six months, so our attention is now focused on improving the integration process.
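As a hedged illustration of the data-integration problem described here, the sketch below shows the usual shape of the fix: map each provider’s fields onto one common schema and flag records that cannot be mapped. All field names are invented for the example.

```python
# Hypothetical illustration of loading disparate provider records into
# a common data model. Every field name below is invented.
COMMON_FIELDS = ("provider_name", "technology", "max_download_mbps")

def normalize(record: dict, field_map: dict) -> dict:
    """Rename one provider's fields to the common schema, flagging gaps."""
    out = {common: record.get(source) for common, source in field_map.items()}
    missing = [f for f in COMMON_FIELDS if out.get(f) is None]
    if missing:
        out["needs_review"] = missing  # route to manual QA instead of failing
    return out

# Two providers describing the same facts with different field names:
print(normalize({"carrier": "Acme", "tech": "DSL", "down_mbps": 6},
                {"provider_name": "carrier", "technology": "tech",
                 "max_download_mbps": "down_mbps"}))
print(normalize({"name": "Beta Fiber", "type": "FTTH"},
                {"provider_name": "name", "technology": "type",
                 "max_download_mbps": "dl_speed"}))  # flagged for review
```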
DME: The application uses OpenStreetMap cartography as its base map. How was the integration experience? Is OpenStreetMap ready to support an application with this level of traffic?
J.M.: This project uses CloudMade, a service based on OpenStreetMap with some very attractive features, one of them being commercial support. They also suffered the flood of requests during the first two days, but I have to say their response was exceptional; within a few hours they had a solution that scaled without problems. In my opinion this option is very attractive because of the ability to adapt the base map to a particular style without having to worry about working with the OpenStreetMap data yourself.
The project URL: http://www.broadbandmap.gov/
Posted on Monday, March 14th, 2011 at 3:56 pm and is filed under Broadband Map, News.
By: Darren Vandergriff
The National Broadband Map: Data Analysis
by Paul Salasznyk, Ph.D.
The unveiling of the National Broadband Map exemplifies the government-wide initiative of data transparency and accountability. Collected semi-annually at an unprecedented level of granularity from 1,650 broadband providers in all 50 states, 3 territories, and the District of Columbia, the data informs the public about broadband availability in their locale and creates opportunities for exploratory data analysis. We included the results of our initial analysis to date in the “Analyze” and “Maps” portions of the Map, but the richness of the data lends itself to further analysis. Consider the vastness of the provider, technology, and speed data included in the Map:
- 13.2 million census block records
- 3.4 million address points
- 4.6 million street segments
- 67,000 wireless shape polygons (transformed into 46.9 million census block records)
- 329,000 community anchor institutions
- 1 million FCC Broadband Quality Test Points
- 20+ million quality assessment records
- Census Bureau demographics data
The National Telecommunications and Information Administration (NTIA) and the Federal Communications Commission (FCC) realized that the complexity and sheer size of this dataset would create challenges throughout the data collection, organization, and analysis process. They entrusted Computech not only with building the Map application, but also with solving this data quandary. To address these complexities, we created a data transformation model that produced a series of cleansed, practical data tables by restating the seemingly disparate data in a geographic “common denominator” and incorporating various broadband analytics use cases. Using a combination of SQL scripts and SAS statistical software, we then calculated detailed provider, technology, and speed availability statistics from these tables. This model alleviated most data problems and will also serve as the basis for future analytics.
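As one hedged sketch of what such a transformation step might look like (the tables and columns below are invented for illustration, not the project’s actual schema), a PostGIS spatial join is the classic way to restate coverage polygons as per-census-block records:

```python
# Sketch of a "geographic common denominator" step: intersect wireless
# coverage polygons with census blocks so every dataset shares one unit
# of analysis. Hypothetical schema, for illustration only.
import psycopg2

TRANSFORM_SQL = """
INSERT INTO wireless_by_block (block_fips, provider_id, technology)
SELECT b.block_fips, w.provider_id, w.technology
FROM census_blocks AS b
JOIN wireless_coverage AS w
  ON ST_Intersects(b.geom, w.geom)  -- PostGIS spatial join
"""

with psycopg2.connect("dbname=broadband user=gis") as conn:
    with conn.cursor() as cur:
        cur.execute(TRANSFORM_SQL)  # commits when the block exits cleanly
```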
The first phase of the project culminated in February’s unveiling of the Map. As states submit updated data semi-annually, the Map will continue to grow and evolve. Capitalizing on our deep understanding of the broadband data, we plan to conduct deeper analytics, including multivariate, time-series, and correlation analysis. Our hope is to find new ways to highlight the data in the Map so that others may use it as a decision support tool, spurring data-driven decision making.
The growth of broadband has the potential to drive significant economic growth and competitiveness. Its implications span education, health care, public safety, and government. Deepening our understanding of the broadband data will require analyzing it in concert with data from these and other sectors of society. Computech is well positioned to take on this task and committed to achieving that understanding.
Posted on Friday, March 11th, 2011 at 11:55 am and is filed under Broadband Map.
By: Paul Salasznyk
CAP in Action: Computech’s Auction Platform Raises $323M in Swedish Auction
The 800 MHz auction, hosted by the Swedish Post and Telecom Agency (PTS), was successfully completed on Friday. The auction, which began last Monday, assigned licenses in the 800 MHz band and concluded after almost five days of bidding and 31 bidding rounds.
The 800 MHz band is attractive for many different kinds of services, such as wireless broadband and mobile telephony. Computech, which created and maintains PTS’s auction processes, was on site for the event, which showcased a customized version of Computech’s Auction Platform (CAP) as it auctioned off the Swedish government’s broadband spectrum.
Last November, PTS awarded Computech the contract for auction consulting services and software in support of its 800 MHz band auction. By Friday’s close, after five days of bidding, PTS had raised $323.7 million through the auction.
Posted on Monday, March 7th, 2011 at 12:31 pm and is filed under Online Auctions.
By: Darren Vandergriff
A Look at The National Broadband Map’s Architecture
Just a couple of weeks ago, the National Telecommunications and Information Administration (NTIA) and the Federal Communications Commission (FCC) launched the National Broadband Map, a website that lets consumers retrieve broadband information for a particular area, including a list of the available providers and the advertised speeds they offer. One challenge of this development effort was how to convey the sheer amount of information available in a simple yet effective way and make it accessible through an online mapping application.
In laying out the architecture of the National Broadband Map, we took the following as design guidelines:
- the online map would have to be as standards-friendly as possible
- every request made through the site should lend itself to being bookmarked, following the REST principles of the web
- speed was an important factor
To satisfy these principles, the Computech team designed an architecture, covering both the mapping display and information retrieval, that is easy to use for end users and web developers alike, internal and external, since all Application Programming Interfaces (APIs) used on the site are published for third parties to consume.
We used a three-tier architecture for both the mapping services and all the APIs that support the website and external developers, so that we could sustain high throughput and scale out. A relational database with spatial extensions (PostGIS) stores the information, including some large geographic layers that represent providers’ coverage and in some cases reach 25 million records (polygons). The middle tier exposes this information through a RESTful layer of public APIs, so that any third-party developer can mash up broadband content in their own applications. These APIs are fully documented on the site.
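Because every API call is a plain HTTP GET with all of its state in the URL, consuming one requires nothing more than a request and a JSON parse. The endpoint path below is a placeholder for illustration; the real paths are listed in the published API documentation.

```python
# Sketch of consuming a RESTful, bookmarkable API like the Map's.
# The path and parameters are placeholders, not actual endpoints.
import json
import urllib.request

url = ("http://www.broadbandmap.gov/broadbandmap/"
       "example/endpoint?format=json")  # hypothetical path

with urllib.request.urlopen(url) as resp:
    data = json.load(resp)
print(data)
```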
For the mapping services, we used GeoServer to expose Open Geospatial Consortium (OGC) compliant web mapping services. Extensive caching was necessary to keep the map display fast in a web browser, and the team also prepared the geospatial data by generalizing several layers and setting scale dependencies for them, making the online maps more efficient.
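Because GeoServer speaks the OGC Web Map Service (WMS) standard, any client can fetch a rendered map image with an ordinary HTTP GET. The host and layer name below are placeholders; the WMS parameters themselves are part of the standard.

```python
# Sketch of a standard OGC WMS GetMap request. Host and layer name are
# placeholders for illustration.
from urllib.parse import urlencode
import urllib.request

params = urlencode({
    "service": "WMS",
    "version": "1.1.1",
    "request": "GetMap",
    "layers": "broadband:coverage",    # hypothetical layer name
    "styles": "",
    "srs": "EPSG:4326",                # plain longitude/latitude
    "bbox": "-125.0,24.0,-66.0,50.0",  # roughly the continental US
    "width": "800",
    "height": "400",
    "format": "image/png",
})
url = "http://example.gov/geoserver/wms?" + params  # placeholder host

with urllib.request.urlopen(url) as resp:
    png_bytes = resp.read()  # a rendered PNG of the requested extent
```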
Standards have been a great asset in architecting and developing the National Broadband Map. Our experience tells us that they work: they form loosely coupled architectures in which small pieces of software perform their specialized tasks with great efficiency, all working together in a larger system. This approach makes an agile development methodology possible and permits quick changes to the system, which evolves more naturally as requirements and priorities change. Building systems this way gives developers a great deal of flexibility, an aspect enhanced by the fact that the effort was built entirely with open-source software, which tends to rest on commonly used standards.
Posted on Friday, March 4th, 2011 at 3:57 pm and is filed under Broadband Map.