The Value of an Idea

What is an idea worth? I am frequently approached by friends and acquaintances with ideas for software applications. Typically, the individual believes that the idea will generate huge revenue, and they will graciously share a portion of the profit with me for my time developing the software. In no instance does the individual suggest that they will help with the development; that’s what they need me for. Furthermore, they do not have the finances to hire a developer, so their hope is that I will invest my ‘sweat equity’ into their dream project.

There are a variety of problems with this proposal. First and foremost, I have bills to pay. As such, I can’t drop paying customers for the hope of a future payout. Furthermore, these projects typically involve niche markets with a limited number of customers. So it is necessary to examine a variety of factors to determine whether such an idea really has any value.

To determine that value, you must first estimate all of the costs involved in producing the software application. This includes development time as well as a variety of other costs: licenses required for development, deployment servers, cloud hosting, the cost of managing those services, and so on. The costs to market and distribute the application must be considered as well, such as the fees paid to Apple or Google to distribute on their mobile platforms.

After the sum of all costs is determined, the individual must make a conservative estimate of the number of units sold as well as the anticipated price point. Once that is accomplished, the profit can be determined: multiply the unit price by the number of units sold and subtract all costs. That figure is an estimate of the value of the idea. Sadly, in many instances, that value is well below $0. Thus, many projects are simply not worth the effort because they have no return on investment.
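To make the math concrete, here is a quick sketch in Python. Every number in it is made up purely for illustration; plug in your own estimates.

# Hypothetical example: estimate the value of an app idea.
# All numbers below are invented for illustration only.
development_hours = 400      # estimated build time
hourly_rate = 100            # developer rate in dollars
other_costs = 2_500          # licenses, hosting, store fees, etc.

unit_price = 5               # anticipated price point
units_sold = 1_000           # conservative sales estimate

total_costs = development_hours * hourly_rate + other_costs
revenue = unit_price * units_sold
idea_value = revenue - total_costs

print(f"Estimated value of the idea: ${idea_value:,}")
# Prints: Estimated value of the idea: $-37,500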

For the developer, it is rarely useful to develop an application based on the future hope of a payout. If you do, you will very likely end up on the losing side of the equation. For the prospective client, if you feel strongly that your idea is profitable, develop a business plan. Then, convince investors of the return on investment they can hope to achieve. Once you have investors, you can then go to a development firm with a far more profitable scenario – pay for the services you want and forget about the free app development.

Technology Degrees Explained

Twenty years ago, technology degrees were divided into two categories: information technology management and computer science. However, during the last decade, universities have developed a variety of new programs to capitalize on the increased diversity of organizational technology needs. For the college student, this means trying to figure out how each of these degrees will help them meet their career objectives.

Computer Science

Computer Science degrees have been around for decades. These programs typically involve lots of math, an in-depth study of computer programming and algorithms, and a deep understanding of computer architectures. CS students often study more than one programming language during their program. Students with Computer Science degrees often pursue careers in software development.

Information Technology Management

Like Computer Science degrees, Information Technology Management degrees have been around for decades. These programs are aimed more at network and system administration. Courses in Windows and Unix management as well as databases are common. Additionally, students will often study network routing and switching protocols. IT majors typically manage an organization’s networks.

Cybersecurity

A relative newcomer to the realm of IT degrees, the Cybersecurity degree path is similar to Information Technology Management. However, the Cybersecurity professional has a much deeper knowledge of how to secure networks as well as how to find and exploit vulnerabilities. Additionally, Cybersecurity degrees may include training on subjects such as law, cryptography, ethics, and risk management.

Data Analytics

Another relative newcomer, Data Analytics degrees mix knowledge of Computer Science with a deeper understanding of data. That understanding of data can then be used to create artificial intelligence models for solving complex business problems. Like the Computer Science student, the Data Analytics student will learn programming. However, whereas a CS student may learn Java or C++, the Data Analyst is more likely to learn Python or R, along with a more in-depth study of statistics.

Software Development

Another newer degree, the Software Development degree takes a wider view of the field. While the Computer Science student dives deep into programming and algorithms, the Software Development path covers a much broader range of topics, including project management, quality assurance, network management, and databases.

Conclusion

While the number of paths is ever increasing, the truth is that there is a tremendous amount of overlap between them. Whether your degree is in Computer Science, Data Analytics, or Software Development, you can find a job as a computer programmer. However, each path will provide you with a slightly different part of the big picture of computing technologies.

Bitcoin? Seems Like Bitcon…

I’ve never been a fan of cryptocurrencies. A few months ago, I wrote about some of the problems with them. However, since I’m a techie, people always expect me to be paying attention to the crypto market. So, in the fall, I “invested” about $750 in crypto. Like any good investor, I diversified across a variety of currencies. How did I fare? Very poorly.

As I write this, Coinbase lists the overall market as down 41% from this time last year. But surely there must be some winners? If so, they weren’t among the 12 coins that I purchased. Bitcoin is one of the worst performers, down 47%. I selected a few that I thought had actual promise because of their utility – Fetch.ai and Internet Computer. Both performed even more poorly and are down by more than 60%.

Risks happen in any market, so this shouldn’t be a surprise. However, cryptocurrency advocates have insisted that these currencies would protect against inflation and would be immune to geopolitical events. Currently, we’re seeing the highest inflation in 40 years and are closer to WWIII than we’ve ever been with Russia’s invasion of Ukraine. As such, cryptocurrencies have not lived up to their hype.

Now, investment firms are considering allowing cryptocurrencies to be included in retirement accounts. Why? Because supporters of crypto must be seeing the Ponzi-scheme nature of their holdings: they finally recognize that more investors are needed to inflate the value of their crypto “investments.”

Every day, the evidence becomes more and more clear – it should have been named Bitcon from the beginning…

Software Project Billing Models

Bookkeeping

Regardless of the type of work done by a service organization, two payment models exist: project-based and hourly. Each of these models works better for certain kinds of projects, and each model has pros and cons for both the buyer and the service-provider.

Service Billing

For simple services that can be estimated accurately, a project-based rate makes sense. Examples include the cost of a car wash, painting a room, or changing the oil in a vehicle. In each of these instances, the actions of the service-provider are nearly identical each time the service is performed. Furthermore, the time required is either constant (changing the oil in a car) or easily estimated from a parameter such as room size (painting a room).

Other types of projects may be more difficult to estimate accurately. For example, gutting and remodeling a bathroom may require an estimate that is highly dependent on how the project progresses. As the contractor moves through the project, unforeseen issues may arise, such as electrical wiring problems or rotten floorboards that could not be known prior to the start of work.

Software Development Models

While many customers may want a project-based price, the reality is that such estimates are often inaccurate. Much like the wiring issues or rotten floorboards found by a contractor, unexpected issues often arise in software development. Furthermore, since every project is unique, developers are often forced to provide what is really nothing more than an educated guess at the timeframe.

This reality has been acknowledged by most software companies as they have moved from “waterfall” to “agile” development methodologies. In waterfall, timeframes and budgets are defined before software development begins. However, companies found that these plans were rarely accurate. In fact, budget overruns and late project delivery were common.

To solve this problem, companies moved to “agile” development. In this model, smaller pieces of work are performed and deployed repeatedly to build an application iteratively. The customer begins using the software as soon as possible and has the ability to change course as needed. For example, a customer may find that an “essential feature” is really not important once other aspects of the project are delivered. Or, they may find that a missing feature prevents required functionality.

Software Billing

Given that software quotes can be highly inaccurate, software companies and clients are left to determine how to best bill for software services. In a project-based model, the development firm takes on all the risks of providing an accurate estimate. However, knowing that estimates are frequently inaccurate, the firm will likely pad the estimate considerably to account for those issues. Furthermore, the software firm will have a vested interest in performing the least amount of work to accomplish the client’s vision. This may result in poor, unmaintainable code or buggy implementations as well as the client paying a higher overall hourly rate.

Conversely, if the software company bills on an hourly rate, they may have less of an interest in performing their job efficiently. Instead, they may want to run the clock to bill more hours. While the client has a better expectation of quality code, their bill may be inflated.

Since both models can be exploited by software development firms, it is important to find a developer you trust. Ultimately, I prefer an hourly model. This allows me to change course as the client’s needs change. I have found that customers rarely have an accurate idea of what they want. However, as the project progresses, and their vision comes into focus, clients are able to provide meaningful direction. If I’m forced to provide a project-based price, the client’s feedback will likely be ignored since their changes would constitute a change in project scope – something not allowed in a project-based model.

Conclusion

As a customer, an understanding of both models can help you interact with a software provider. While your first thought may be that the software company is trying to run up the bill with extra hours, any decent developer can provide you a ballpark figure for your project. However, know that such an estimate may be subject to change based on unforeseen problems as well as changes in your requirements as the project moves forward.

The Technology Ride

The world has gone through some amazing transformations during the last half century. In the early 80’s, computers were a rarity and cell phones were a novelty of wealthy business executives. During the 90’s, all that changed with the release of Windows 95, a pivotal point in the history of technology: for the first time, computers were easy enough for the home user. A decade later, Apple would develop the iPhone, followed by Google’s Android platform, which would change the face of technology again. Today, computers and cell phones are ubiquitous.

My Experiences

Being born in the late 70’s, I have been able to witness this transformation first hand. In addition, I have had the incredible opportunity to take part in the creation of technologies myself. My career began in the US Army in 1995 where I served as a member of the Intelligence Community. I learned to use and administer SunOS and Solaris machines, and began my experimentation with programming. It was in this environment that I developed a love for Unix-based systems that continues to this day.

On those Unix machines, I started programming in C, C++, TCL, Perl, and Bourne Shell. While my first programs were pretty bad, I would eventually have an opportunity to write code for a classified government project. That code earned me a Joint Service Achievement Medal and made a profound impact within the intelligence community at the time.

After leaving the Army, I entered the civilian workforce to develop Point-of-Sale applications using C++. I would spend nearly two decades developing code for a variety of companies using various platforms and languages. I developed low-level code for phone systems, created custom Android operating systems, and programmed countless web and mobile applications.

Today

Now, I run my own business developing software for clients and creating artificial intelligence solutions. But when I look back, I am always amazed at how far the technology revolution has brought us, and I am thankful that I have had a chance to be a part of that revolution! With 20 years left before I retire, I can’t imagine where technology will take us tomorrow. Yet, I can’t imagine not being a part of that future!

For those who were born in the 90’s or in the new millennium, you will never know how much the world has changed. But for my generation, we watched it happen – and many of us played a part in making it happen!

Overview of CompTIA Certifications

A variety of computer certifications exist today. Those certifications fall into one of two categories – vendor-neutral or vendor-specific. In the vendor-neutral category, CompTIA is the industry leader. Best known for its A+ certification, CompTIA has been around for 40 years and has certified over 2.2 million people.

Today, CompTIA issues over a dozen IT certifications covering everything from computer hardware to project management. Beyond single certifications, CompTIA also offers what it calls ‘Stackable Certifications’, which are earned by completing multiple CompTIA certifications. For example, earning both the A+ and Network+ certifications results in the CompTIA IT Operations Specialist certification.

Hardware Certifications

Individuals who want to work with computer hardware maintenance and repair should start with the A+ certification. This exam covers basic computer hardware and Windows administration tasks. For anyone wanting to work with computers, this exam covers the fundamental knowledge required for success.

Once you have mastered computer hardware, the next step is computer networks. This knowledge is covered by CompTIA’s Network+ certification. Topics in this exam include both wireless and wired network configuration, knowledge of switches and routers, and other topics required for network administration. Note that this exam is vendor-neutral; as such, knowledge of specific routers (such as Cisco) is not required.

Security Certifications

CompTIA offers a variety of security certifications for those who wish to ensure their networks are secure or to test network security. The first exam in this category is the Security+ exam. This exam covers the basics of security, including encryption, WiFi configuration, certificates, firewalls, and other security topics.

Next, CompTIA offers a variety of more in-depth security exams on topics such as penetration testing (PenTest+), cybersecurity analysis (CySA+), and advanced security issues (CASP+). Each of these exams continues where the Security+ exam ends and requires a far more extensive knowledge of security. With all of the security issues in the news, these certifications are in high demand among employers.

Infrastructure Certifications

CompTIA offers several tests in what it calls the ‘infrastructure’ category. These exams are particularly useful for people who administer cloud systems or manage servers. Certifications in this category include Cloud+, Server+, and Linux+. If your organization utilizes cloud-based platforms, such as AWS or Google Cloud Platform, these certifications provide a vendor-neutral starting point. However, if you really want to dive deep into topics like AWS, Amazon offers numerous exams specifically covering their platform.

Project Management Certification

While not hardware related, CompTIA offers an entry-level certification for project management called Project+. This exam is less detailed and time-consuming than other project management certifications but covers the essential details of project management.

Conclusion

For the aspiring techie or the individual looking to advance their career, CompTIA provides a number of useful certifications. While certifications from other vendors may cost thousands of dollars, CompTIA exams are generally under $400. This is money well spent in the competitive IT world as CompTIA is one of the most respected names in vendor-neutral IT certifications.

Apple vs Android – A Developer’s Perspective

While most applications are developed for both iPhone and Android systems, new developers are faced with the choice of which platform to learn. While both Android and iPhone systems offer excellent apps as well as a variety of device sizes, they differ considerably from a developer’s perspective.

Android Development

For the novice, Android development is probably the easier entry point. For starters, low-end Android phones are cheaper to purchase than iPhones. But more importantly, Android developers can use Windows, Linux, or Mac machines for development. So, if you have a computer and an Android phone, you can get started right away.

The languages used for Android development are Java and Kotlin. While Kotlin is the newer language, more resources on Java development are available to get started. Furthermore, once you learn Java, you will find other development opportunities open up to you – such as backend services using frameworks like Spring Boot.

Once you have learned how to program Android phones, you will find that other devices use Android as well. This includes Virtual Reality hardware such as Oculus, Augmented Reality glasses from vendors like Vuzix, and smart watches.

Publishing to Google is relatively simple too. Once you pay a one-time fee, you are a licensed developer and can create and deploy applications to the Google Play store. While there is some oversight from Google, it is less burdensome than Apple’s requirements.

iPhone Development

iPhone development is a little more complicated. For starters, you will need a Mac machine as the tools for iPhone development do not run under Windows or Linux. Furthermore, both Apple computers and iPhones tend to be more expensive for a small development setup.

While Android’s Java language is used everywhere, the iPhone’s Swift language is far more limited; it is rarely used outside of the Apple ecosystem. So, if you choose to develop other services to integrate with your phone, you will likely need to learn an additional language.

Unlike Android, few devices run iOS. Thus, your skills on iPhone development will not translate to the ability to program other devices aside from the Apple Watch.

Finally, Apple’s App Store is far more expensive and burdensome than the Google Play Store. For starters, Apple requires developers to pay an annual license fee – which is more expensive than Google’s one-time license cost. Furthermore, the App Store is much stricter with requirements for apps and provides significantly more oversight of the app market.

Conclusion

While I think both the Apple and Android phones are excellent, I personally find the Android developer experience to be more positive. This is particularly true for the indie developer or individual looking to learn mobile development.

What is Computer Vision?

Computer Vision is a rapidly growing technology field that most people know little about. While Computer Vision has been around since the 1960s, its growth really exploded with the creation of the OpenCV library. This library provides the tools necessary for software engineers to create Computer Vision applications.

But what is Computer Vision? Computer Vision is a mix of hardware and software tools that are used to identify objects from a photo or camera input. One of the more well-known applications of Computer Vision is in self-driving cars. In a self-driving car, numerous cameras collect video inputs. The video streams are then examined to find objects such as road signs, people, stop lights, lines on the road, and other data that would be essential for safe driving.

However, this technology isn’t just available in self-driving cars. A vehicle I rented a few months ago was able to read speed limit signs as I passed by and display that information on the dash. Additionally, if I failed to signal a lane change, the car would beep when I got close to the line.

Another common place to find Computer Vision is in factory automation. In this setting, specialized programs may monitor products for defects, check the status of machinery for leaks or other problematic conditions, or monitor the actions of people to ensure safe machine operation. With these tools, companies can make better products more safely.

Computer Vision and Artificial Intelligence are also becoming more popular for medical applications. Images of MRI or X-Ray scans can be processed using Computer Vision and AI tools to identify cancerous tumors or other problematic health issues.

From a less practical view, Computer Vision tools are also used to modify videos of user content. This may include things such as adding a hat or making a funny face. Or, it may be used to identify faces in an image for tagging.
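To give a sense of what such an application looks like in code, here is a minimal sketch using OpenCV’s Python bindings to find faces in a single photo. The filename sample.jpg is just a placeholder, and the detector is a Haar cascade that ships with OpenCV.

# Minimal OpenCV sketch: detect faces in one image.
# "sample.jpg" is a placeholder; substitute any photo on disk.
import cv2

# Load the image and convert it to grayscale for the detector.
image = cv2.imread("sample.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Use the frontal-face Haar cascade bundled with OpenCV.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

# Find faces and draw a green rectangle around each one.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("detected.jpg", image)
print(f"Found {len(faces)} face(s)")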

Ultimately, Computer Vision technologies are being found in more and more places each day and, when coupled with AI, will ultimately result in a far more technologically advanced world.

What is the Dark Web?

Most people have heard of the Dark Web in news stories or tech articles. But what is it? How does it work? Is it worth visiting?

The Dark Web is a hidden network of highly encrypted machines available over the internet, but not accessible through a typical web browser. While any content can be stored on the Dark Web, the majority of the content is of a questionable nature – such as child pornography, snuff films, drug and fake ID stores, and similar content. One such marketplace, Dark Amazon, provides users with an Amazon-like shopping experience.

However, not everything on the Dark Web is illegal or unethical. In fact, the Dark Web can be a very useful tool for individuals in China to access Facebook (they have a Dark Web site) or for intelligence operatives in Iran to contact the CIA (they’re on the Dark Web too).

In short, the Dark Web is a useful tool if you are trying to remain anonymous or operating within a country that has strict internet controls. But what do you need to get started? Simple – download a TOR browser. TOR stands for The Onion Router. The idea of an onion is that there are multiple layers of encryption and that the traffic is routed through numerous machines to prevent tracking. TOR browsers exist for all major platforms – including mobile – and are really no different than any other web browser.

The real challenge, however, is finding content. For that, you will need a Dark Web search engine – such as TOR66. However, a few suggestions before you give it a try. First, run a VPN. While there is nothing illegal about using a TOR browser, you may draw suspicion from your ISP, and it is always possible that TOR users are being watched by law enforcement. Second, make sure the antivirus on your machine is up to date – you never know what’s out there. Third, trust no one. The Dark Web is a place of thieves, con artists, drug dealers, and other people with questionable ethics.

Productivity Gains through Aliases & Scripts

For users of Mac or Linux-based machines, aliases and scripts can create some of the most valuable tools for increased productivity. Even if you run a Windows machine, there is a strong possibility that some machines you interact with – such as AWS – utilize a Linux-based framework.

So, what are aliases and scripts? Scripts are files that contain a sequence of instructions needed to perform a complex procedure. I often create scripts to deploy software applications to development servers or to execute complex software builds. Aliases are much shorter, single-line commands that are typically placed in a system startup file such as the .bash_profile file on MacOS.

Below are some of the aliases I use. Since Mac doesn’t have an hd command (like Linux), I have aliased it to call hexdump -C. Additionally, Mac has no command for rot13, a very old command that performs a Caesar cipher, so I have aliased it to use tr.

Since I spend a lot of time at the command prompt, I have created a variety of aliases to shortcut directory navigation, including a set of up commands to move me up the directory hierarchy (particularly useful in a large build structure) and a command to take me to the root folder of a git project.

Finally, I have a command to show me the last file created or downloaded. This can be particularly useful; for example, to view the most recently created file I can simply execute cat `lastfile`.

alias hd='hexdump -C'
alias df='df -h'
alias rot13="tr 'A-Za-z' 'N-ZA-Mn-za-m'"
alias up='cd ..'
alias up2='cd ../..'
alias up3='cd ../../..'
alias up4='cd ../../../..'
alias up5='cd ../../../../..'
alias up6='cd ../../../../../..'
alias root='cd `git rev-parse --show-toplevel`'
alias lastfile="ls -t | head -1"

One common script I use is bigdir. This script will show me the size of all folders in my current directory. This can help me locate folders taking up significant space on my computer.
#!/bin/bash
# bigdir: report the size of every directory in the current folder.
# Globbing handles names that contain spaces, so no IFS tricks are needed.

for file in *
do
        if [ -d "$file" ]
        then
                # -h: human-readable sizes, -s: one summary line per directory
                du -hs "$file" 2> /dev/null
        fi
done

Another script I use helps me find a text string within the files of a folder.
#!/bin/bash
# Search every file under the current directory for a string (case-insensitive)
# and print the matching lines grouped by file name.

if [ $# -ne 1 ]
then
        echo "Call is: `basename $0` string"
else
        # find lists every file; cut strips the leading "./" from each path.
        find . -type f | cut -c3- | while read -r file
        do
                # -q: just test for a match; quoting "$1" handles search strings with spaces.
                if grep -iq "$1" "$file" 2> /dev/null
                then
                        echo "******$file******"
                        grep -i "$1" "$file"
                fi
        done
fi

These are just a few examples of ways to use scripts and aliases to improve your productivity. Do you have a favorite script or alias? Share it below!