Overview of CompTIA Certifications

A variety of computer certifications exist today, and they fall into one of two categories: vendor-neutral or vendor-specific. In the vendor-neutral category, CompTIA is the industry leader. Best known for its A+ certification, CompTIA has been around for 40 years and has certified over 2.2 million people.

Today, CompTIA issues over a dozen IT certifications covering everything from computer hardware to project management. Beyond individual certifications, CompTIA also offers what it calls ‘Stackable Certifications’, which are earned by completing combinations of CompTIA exams. For example, earning both the A+ and Network+ certifications results in the CompTIA IT Operations Specialist certification.

Hardware Certifications

Individuals who want to work in computer hardware maintenance and repair should start with the A+ certification. The exam covers basic computer hardware and Windows administration tasks, the fundamental knowledge anyone working with computers needs for success.

Once you have mastered computer hardware, the next step is computer networks, which are covered by CompTIA’s Network+ certification. Topics in this exam include wireless and wired network configuration, knowledge of switches and routers, and other skills required for network administration. Note that this exam is vendor-neutral; as such, knowledge of specific routers (such as Cisco’s) is not required.

Security Certifications

CompTIA offers a variety of security certifications for those who wish to secure their networks or test network security. The first exam in this category is Security+, which covers the basics of security, including encryption, Wi-Fi configuration, certificates, firewalls, and related topics.

Next, CompTIA offers several more in-depth security exams on topics such as penetration testing (PenTest+), cybersecurity analysis (CySA+), and advanced security practice (CASP+). Each of these exams continues where Security+ ends and requires far more extensive security knowledge. With security breaches regularly in the news, these certifications are in high demand among employers.

Infrastructure Certifications

CompTIA offers several exams in what it calls the ‘infrastructure’ category, which are particularly useful for people who administer cloud systems or manage servers. Certifications in this category include Cloud+, Server+, and Linux+. If your organization uses cloud platforms such as AWS or Google Cloud Platform, these certifications provide a vendor-neutral starting point. However, if you want to dive deep into a specific platform like AWS, Amazon offers numerous exams covering its services.

Project Management Certification

While not hardware related, CompTIA offers an entry-level project management certification called Project+. This exam is less detailed and time-consuming than other project management certifications but covers the essentials of managing a project.

Conclusion

For the aspiring techie or the professional looking to advance a career, CompTIA provides a number of useful certifications. While certifications from other vendors can cost thousands of dollars, CompTIA exams generally cost under $400. That is money well spent in the competitive IT world, as CompTIA is one of the most respected names in vendor-neutral IT certification.

Apple vs Android – A Developer’s Perspective

Most applications are developed for both iPhone and Android, but new developers face a choice of which platform to learn first. While both systems offer excellent apps and a variety of device sizes, they differ considerably from a developer’s perspective.

Android Development

For the novice, Android development is probably the easier entry point. For starters, low-end Android phones are cheaper to purchase than iPhones. More importantly, Android developers can use Windows, Linux, or Mac machines for development. So, if you have a computer and an Android phone, you can get started right away.

Android apps are written in Java or Kotlin. While Kotlin is the newer language, more learning resources are available for Java. Furthermore, once you learn Java, other development opportunities open up to you, such as building backend services with frameworks like Spring Boot.

Once you have learned how to program Android phones, you will find that other devices use Android as well, including virtual reality hardware such as Oculus, augmented reality glasses from vendors like Vuzix, and smartwatches.

Publishing to Google is relatively simple too. Once you pay a one-time fee, you are a licensed developer and can create and deploy applications to the Google Play Store. While there is some oversight from Google, it is less burdensome than Apple’s requirements.

iPhone Development

iPhone development is a little more complicated. For starters, you will need a Mac, as the tools for iPhone development do not run on Windows or Linux. Furthermore, both Apple computers and iPhones tend to be more expensive, which raises the cost of a small development setup.

While Android’s Java language is used everywhere, the iPhone’s Swift language is far more limited and sees little use outside the Apple ecosystem. So, if you choose to develop other services to integrate with your phone, you will need to learn an additional language.

Unlike Android, few devices run iOS. Thus, your iPhone development skills will not translate to programming other devices, aside from the Apple Watch.

Finally, Apple’s App Store is more expensive and more burdensome than the Google Play Store. For starters, Apple requires developers to pay an annual license fee, which costs more than Google’s one-time fee. Furthermore, the App Store is much stricter about app requirements and exercises significantly more oversight over its marketplace.

Conclusion

While I think both Apple and Android phones are excellent, I personally find the Android developer experience more positive. This is particularly true for the indie developer or anyone looking to learn mobile development.

What is Computer Vision?

Computer Vision is a rapidly growing technology field that most people know little about. While Computer Vision has been around since the 1960s, its growth really exploded with the creation of the OpenCV library, which provides the tools software engineers need to create Computer Vision applications.
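To give a feel for what the library offers, here is a minimal sketch using OpenCV’s Python bindings (the cv2 module). The filenames and parameter values are placeholders chosen for illustration, not a recommended pipeline.

    import cv2

    # Load an image from disk ("photo.jpg" is a placeholder filename).
    image = cv2.imread("photo.jpg")

    # A few of the preprocessing building blocks OpenCV provides:
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)  # color to grayscale
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)     # noise reduction
    small = cv2.resize(blurred, (320, 240))         # downscale

    # Save the processed image back to disk.
    cv2.imwrite("processed.jpg", small)

Primitives like these are the raw material from which larger vision applications, including the examples below, are assembled.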

But what is Computer Vision? Computer Vision is a mix of hardware and software tools used to identify objects in photos or camera input. One of the better-known applications is the self-driving car, in which numerous cameras collect video input. The video streams are then examined to find objects such as road signs, people, stop lights, lane lines, and other information essential for safe driving.
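As a simplified illustration of the ‘lines on the road’ piece, here is a sketch of classic lane-line detection with OpenCV in Python. Real driving systems are far more sophisticated; the video filename and tuning parameters are assumptions made for the example.

    import cv2
    import numpy as np

    # Read a single frame from a dash-cam video ("dashcam.mp4" is a placeholder).
    capture = cv2.VideoCapture("dashcam.mp4")
    ok, frame = capture.read()
    capture.release()

    # Edge detection followed by a Hough transform is a classic way
    # to find straight lane markings in a road image.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                            minLineLength=100, maxLineGap=10)

    # Draw each detected line segment back onto the frame.
    if lines is not None:
        for x1, y1, x2, y2 in lines[:, 0]:
            cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)

    cv2.imwrite("lanes.jpg", frame)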

However, this technology isn’t just available in self-driving cars. A vehicle I rented a few months ago was able to read speed limit signs as I passed by and display that information on the dash. Additionally, if I failed to signal a lane change, the car would beep when I got close to the line.

Another common place to find Computer Vision is in factory automation. In this setting, specialized programs may monitor products for defects, check the status of machinery for leaks or other problematic conditions, or monitor the actions of people to ensure safe machine operation. With these tools, companies can make better products more safely.
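A defect check can be surprisingly simple in concept. The sketch below compares a product photo against a known-good reference image and flags large pixel differences; it assumes the two images are aligned and the same size, and the filenames and thresholds are placeholders.

    import cv2

    # Load a known-good reference image and a photo of the part under test
    # (both filenames are placeholders).
    reference = cv2.imread("good_part.jpg", cv2.IMREAD_GRAYSCALE)
    sample = cv2.imread("test_part.jpg", cv2.IMREAD_GRAYSCALE)

    # Pixels that differ strongly from the reference are flagged as defects.
    diff = cv2.absdiff(reference, sample)
    _, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)

    # Count the flagged pixels and apply a simple pass/fail rule.
    defect_pixels = cv2.countNonZero(mask)
    print("PASS" if defect_pixels < 500 else "FAIL", defect_pixels)

Production systems add controlled lighting, camera calibration, and often machine learning models, but the underlying idea of comparing what the camera sees against an expectation is the same.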

Computer Vision and Artificial Intelligence are also becoming more popular in medical applications. MRI and X-ray images can be processed with Computer Vision and AI tools to identify cancerous tumors and other health problems.

On a less practical note, Computer Vision tools are also used to modify user-generated photos and videos, such as adding a virtual hat or a funny face filter. Similarly, Computer Vision can identify the faces in an image for tagging.
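Face detection, the first step in tagging, is a standard OpenCV task. The sketch below uses one of the pre-trained Haar cascade classifiers bundled with the opencv-python package; the photo filename is a placeholder.

    import cv2

    # OpenCV ships with pre-trained Haar cascade classifiers;
    # cv2.data.haarcascades points at the directory where they are installed.
    cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    face_cascade = cv2.CascadeClassifier(cascade_path)

    # Load a photo and search it for faces ("group_photo.jpg" is a placeholder).
    image = cv2.imread("group_photo.jpg")
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    # Draw a box around each detected face, the starting point for tagging.
    for (x, y, w, h) in faces:
        cv2.rectangle(image, (x, y), (x + w, y + h), (255, 0, 0), 2)

    cv2.imwrite("tagged.jpg", image)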

Computer Vision technologies are appearing in more and more places every day and, when coupled with AI, will ultimately lead to a far more technologically advanced world.