How to avoid phishing scams while working from home
- Waldo Nell
- Tech for Execs
- minute(s)

Working from home, especially for the first time, may seem a bit like leaving home as a young adult. You enter a brave new world, free of the protections your parents' home provided, to discover yourself. In this case, the office is the secure home you are leaving, at least from an IT security perspective. One of the security issues to consider when working remotely is phishing. This article defines the problem, provides a real-world example, and lists eight steps to avoid phishing attempts.

Phishing

The definition of a phishing attack: a fraudulent attempt by an adversary posing as a legitimate entity to steal sensitive information from someone.

The problem

Chances are you have seen emails from someone familiar asking you to click on a link or download an attached file. With some review, you realized the person who sent it is not the person they claimed to be. The idea behind this attack is to gain your trust by posing as someone you know. The basic process is shown below: the adversary's phishing email causes the user to connect to the malicious site (red connection) controlled by the adversary, instead of the legitimate website (green connection). This allows the adversary to intercept the login credentials of the user.

Below is a real-world scenario that we will be analyzing. All the suspicious elements visible to a non-technical user have been highlighted in green. It purports to be from our phone system, informing me that I have a new voicemail. We analyze it below.

The email has a file attached. Carefully reviewing that attachment shows it to be an HTML file. If this were a real voicemail attachment, it would be a WAV or MP3 file. Another approach, not used in this message, includes a link the user is asked to click to perform some action. This link might look like the one below: if the user hovers their mouse cursor over the link "Release Message", the actual URL that would be opened is displayed (as highlighted above).
Often it is clear that this URL points to a domain that is not related to the sender of the email.

Other warning signs in our example:

- The FROM address is not from any known domain and does not match the name of the sender.
- The subject is suspicious. What does a "protected recording" mean?
- Misuse of the Importance flag. Marking an email as High Importance is a common tactic among phishing attacks.
- Font substitution, a common technique used to try to bypass spam filters. It works by substituting some letters with lookalike characters that still read as valid English. Since the Greek ε is not the same character as the Latin e, it can fool anti-spam algorithms.

What happens if the user doesn't notice any of the above warning signs? In this example, the following occurs:

1. If the user double clicks on the attachment named _NewAudioMessageFile_000.htm, their browser loads the contents of that HTML file locally.
2. A page is loaded that redirects to a new website (owned by Mary Jean Suarez in the Philippines) that looks like this: note this is not a Microsoft URL, but it was made to look like a Microsoft page to appear even more legitimate.
3. If the user stops here, the adversary knows the email address that received the email is active and that the link was clicked. No compromise of any account has occurred yet.
4. If the user enters their password (correct or incorrect) and clicks Sign In, the password, together with the email address, is sent to the adversary's server (called credential harvesting). The following page is displayed: as the adversary does not know whether the supplied password is valid, the best they can do is forward the user to the real portal.
5. The page then redirects to this legitimate Microsoft web page: most users would not think twice about having to log in again, assuming perhaps they mistyped something. This time, the user is logging in to the real Microsoft website. The scam is completed.
If the user entered their correct password in step 4 above, their account has been compromised, and the adversary now has full access to it. You should immediately change your password and monitor your account for suspicious activity.

The mitigation

Phishing attacks are nothing new. But they have increased 350% during the pandemic and are now more dangerous than ever because:

- Your computer is no longer offered all the protections of the office network.
- You have no direct access to IT to question suspicious emails or activities.
- If you are using your home computer, it is likely unpatched, making it easier for phishing attempts to succeed.
- Because we are working in unprecedented times and are likely working remotely, requests that would have stood out as unusual two months ago might slip by unnoticed today.

What can you do to protect yourself? Below are eight recommended steps to increase your protection:

- Realize that it is inevitable that you will receive many phishing attempts, and that you have a responsibility to be vigilant.
- Know how to spot a phishing attempt. This can be very complex and technical; however, there are several markers you can watch for to identify most phishing attempts. Email FROM addresses can be easily faked: just like in our example above, phishing attempts will get the name in the FROM field of an email to reflect someone you know, but not the email address. This is not universally true, but it is a good first warning sign.
- If the email asks you to click on a link, pause for 10 seconds and carefully consider whether the email was expected. If not, contact your IT department to verify its validity before clicking any links. An additional safety measure is to hover your mouse over the link in the email and make sure the domain in the link (i.e. the site it is pointing to) seems familiar.
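Hovering to inspect a link's real destination can also be approximated in code. The sketch below (the domain names are purely illustrative) pulls the hostname out of a URL and checks whether it belongs to the domain you would expect from the sender:

```python
from urllib.parse import urlparse

def link_domain(url):
    """Return the hostname a link actually points to."""
    return urlparse(url).hostname or ""

def looks_suspicious(url, expected_domain):
    """True when the link's host is neither the expected domain
    nor one of its subdomains."""
    host = link_domain(url).lower()
    expected = expected_domain.lower()
    return not (host == expected or host.endswith("." + expected))

# Display text in an email can say anything; only the target URL matters.
print(looks_suspicious("https://portal.office.com/billing", "office.com"))      # False
print(looks_suspicious("https://office.com.evil.example/login", "office.com"))  # True
```

Note the second example: a host like office.com.evil.example starts with a familiar name precisely to fool a quick glance, which is why the check must compare the end of the hostname, not the beginning.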
- If the email asks you to perform any sensitive operations, like releasing a payment, it is best to call the sender via a known number and verify the request.
- Disable automatic loading of images. Most email applications allow you to turn off automatic loading of images in emails. This is recommended because many phishing attempts work by including a link to an image that lets the attacker know your email address is legitimate.
- Don't click the link given in the email; instead, open a new browser window and go to the site in question yourself. For instance, if you get an email about a Microsoft Office 365 account needing updated payment details, don't click the link in the email. Open a new browser window, type in https://portal.office.com, and go to the billing section yourself. This will thwart the phishing attempt. Sometimes this is not possible, such as when the email contains a link to a file you need to view.
- Enable Two Factor Authentication (2FA) on all critical systems. If this had been enabled in our example above, the adversary would only have your password but would be unable to log in.
- Be sure your computer is set to download and apply patches automatically. Applications are increasingly adding features to detect and warn users about suspect emails automatically.
- Finally, never allow someone to connect remotely to your computer unless you know them.

While there is no single thing you can do to prevent a phishing attempt from succeeding, these basic guidelines can greatly mitigate the risk of falling prey to one.
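The font-substitution trick described earlier, where a Greek ε stands in for a Latin e, can also be surfaced programmatically. A minimal sketch (the sample subject line is made up): it flags any non-ASCII letter, which in an English-language subject line is a prime homoglyph suspect.

```python
import unicodedata

def find_homoglyphs(text):
    """Flag any letter outside the ASCII range, together with its
    Unicode name, so a human can judge whether it is a lookalike."""
    suspects = []
    for ch in text:
        if ord(ch) > 127 and ch.isalpha():
            suspects.append((ch, unicodedata.name(ch, "UNKNOWN")))
    return suspects

# A hypothetical subject line with a Greek epsilon hiding in "recording":
print(find_homoglyphs("Protected rεcording for you"))
# [('ε', 'GREEK SMALL LETTER EPSILON')]
```

This is exactly the kind of check spam filters run internally; the point of the sketch is simply that a character which looks like an "e" to you can be a completely different code point to a machine.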
Working remotely exposes us to increased security risks and the pandemic has seen a 350% increase in phishing attempts. These eight steps can protect you.
READ MORE
Improve your ability to work remotely: Understanding VPNs
- Waldo Nell
- Tech for Execs
- minute(s)

Given the explosion in demand for remote work, we wanted to explain a technology that many of us use, but few in finance understand well. If you are working from home and connect to the office, you likely use a VPN. VPN stands for Virtual Private Network and is a technology as old as the internet itself.

While working from home you may have experienced applications running slowly or not at all. This can be frustrating, to say the least. As we have said elsewhere, a little technological insight can improve your ability to discuss your problems with IT, ultimately yielding better performance and improving your ability to mitigate risk. That is our goal for this article.

Overview

To understand how a VPN works at a high level, one needs a basic understanding of networks. When you are at your corporate office, you will most likely be connecting to the corporate LAN (Local Area Network). A LAN is simply a network that is not accessible directly from the internet; it connects together a limited number of computers and other resources located in close physical proximity to one another (such as those at your office location). Each organization will have at least one LAN of its own to which its computers are connected. Each LAN connects to the internet, which is itself a type of network (a WAN, or Wide Area Network).

Generally speaking, one LAN cannot talk to another LAN, even though both LANs can (usually) connect to the internet. This is intentional, to isolate traffic and to help protect your resources. Consequently, when you are at home (the network at your home is also a LAN) and you need to access resources on your office network, such as a Remote Desktop server or network drives, you need tools to first connect to the office LAN. This is where a VPN comes to the rescue.
Assuming your IT department allows remote connections, IT would typically: install software on the corporate network that listens for VPN connections, and install VPN client software on your laptop/home computer (depending on the type of VPN). With this work complete, the client software can connect to the server software at the office and create a virtual, encrypted tunnel. All traffic destined for your office that would previously fail to move between the two LANs now flows through this virtual tunnel.

It is important to understand that traffic still flows from your LAN over the public internet to the office LAN and back again. That is why the P in VPN, for Private, is crucial: all data is encrypted so that nobody on the internet can intercept and decode the traffic. The diagram below shows the basic configuration of a typical VPN.

Finally, the VPN is temporary. The moment you disconnect, the tunnel closes, and the two LANs are decoupled again.

Performance

Although working remotely via a VPN is very similar to working locally at the office from a practical point of view, some major differences are important to understand to maximize your effectiveness and minimize frustration as they relate to performance and security. We understand that many of you reading this only need a cursory overview. To really participate in a discussion with IT, however, you will need to know more. For those who just need the high-level view, we provide that first. Then, for those with the patience and interest, we provide much more detail further on.

In Summary

If you are working from home over a VPN and experience poor "performance", there are five main items to consider:

Throughput - the speed (both download and upload) of your home network is likely to be slower than your office LAN. To test your speed, visit www.speedtest.net and note both the download and upload speeds. This is the variable people tend to focus on, but it is only part of the story.
Having said that, if you can get faster throughput inexpensively, and depending on the kind of remote work you will be doing, it might be worth considering.

Latency - given you are physically further from your office computers and servers, there is additional delay in communication. There is not a lot IT can do to fix this problem; it varies based on your actual location and the path the data travels to and from your office.

Congestion - what else is competing for use of your home bandwidth? Eliminate unnecessary uses of the internet (kids streaming Netflix, etc.) and see if that improves your performance.

Routing - how much of your laptop's traffic goes over the VPN? Discuss this with IT: ideally, when you are out of the office, only your requests for data residing in the office should go over the VPN, not all your web traffic too.

Application Design - some applications assume an immediate, direct connection to their required data. When that is not available, delays ensue. Be sure to tell IT whether your poor performance is application specific or general.

In Detail

There are five major attributes (simplified) that determine a network's performance:

Throughput

You are probably most familiar with this attribute, usually measured in Mbps or Gbps (mega- or gigabits per second, respectively). It refers to your network's ability to move data around. The higher the number, the faster data travels between devices and, generally, the better the performance. Most corporate LANs run at least 1Gbps networks; however, some client devices might still connect to the server infrastructure at 100Mbps. Your home office might make use of a connection 10 to 50 times slower: typical DSL or cable speeds are anywhere from 5Mbps in rural areas to 500Mbps in cities. Also consider that most residential internet plans are asymmetric.
This means the throughput for download (like watching a Netflix movie or browsing the internet) is sometimes ten times higher than the throughput for upload (like sending an email). So even though you may have a connection supporting 50Mbps download, it may only support 5Mbps upload. This can impact VPN performance significantly, as you need to both download and upload data to and from the corporate LAN.

Latency

Latency is a more obscure attribute but, in some ways, more important than throughput. A computer uses lots of transistors to switch electromagnetic waves on and off at great speed (typically 1 - 4 billion times per second). The binary digits, or bits, your computer uses for communication are carried via these electromagnetic waves. As you might recall from high school physics, light (which is an electromagnetic wave) propagates at a maximum speed of approximately 300 000 km/s in a vacuum. The electromagnetic waves in network cables, optic fibres, and other equipment typically propagate at about 66% of the speed of light, or 200 000 km/s. This is important because it allows us to determine how long it will take for a packet of data to be sent from your computer to the server and a response to come back, also known as the round trip time (RTT).

On a LAN, the physical distance between your computer and the server might be less than 100m. A quick calculation shows that:

\begin{align*} RTT &= \frac{100}{200000000} \times 2 \\ &= 1\,\mu s \end{align*}

So for each packet of data your computer and the server exchange, it will take at least 1µs for the packet to make the round trip. Considering most communication requires a large number of small packets (due to the design of TCP/IP), this quickly accumulates. Simplifying a lot, transferring a single 2MB document requires approximately 1400 packets of data. Ignoring throughput, latency alone would introduce an additional 1.4ms of delay.
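The round-trip arithmetic above is easy to reproduce for your own situation. A small sketch using the same 200 000 km/s propagation speed and the ~1400-packet estimate for a 2MB file (plug in your own distance to see the effect):

```python
PROPAGATION = 200_000_000  # m/s, ~66% of the speed of light in cable/fibre

def rtt_seconds(distance_m):
    """Round-trip time for one packet over the given one-way distance."""
    return distance_m / PROPAGATION * 2

def added_latency(distance_m, packets):
    """Extra delay from latency alone, ignoring throughput entirely."""
    return rtt_seconds(distance_m) * packets

print(rtt_seconds(100))           # LAN, 100 m away -> 1e-06 s, i.e. 1 microsecond
print(added_latency(100, 1400))   # 2 MB file on the LAN -> ~0.0014 s of added delay
```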
That does not sound like much, and working locally on a LAN therefore generally produces great performance. If you are working from home, however, things change a lot. Your office might now be 200km away (not physically, but via the route your packets travel through the internet over the VPN). Consider copying that same file:

\begin{align*} RTT &= \frac{200000}{200000000} \times 2 \\ &= 2000\,\mu s \\ &= 2\,ms \end{align*}

Each packet now takes 2ms to travel between your computer and the server - 2000 times slower. So that 2MB file will now incur an additional 2.8 seconds of latency above and beyond the time it takes to transfer the actual contents. This is certainly observable and can become highly problematic with certain applications.

Congestion

When most staff are sent home, as we are currently experiencing with COVID-19, a large number of people are concurrently connected via VPN. The corporate network might not be optimized for this amount of traffic, causing a degraded experience for all. Additionally, a partner or family member might be watching Netflix while you are trying to do remote work, reducing the throughput and possibly affecting your experience. Most home networks do not have traffic shaping enabled - something many corporate offices have implemented that allows certain traffic to have higher priority and not be impacted by someone else downloading a file or watching a movie.

Another cause of congestion is WiFi. Most people use WiFi to connect their home computers to the internet due to the convenience it offers. Due to the way WiFi is designed, when a client computer sends data to the internet via WiFi, the WiFi access point can only talk to that one client at that point in time. Any other clients must wait until the access point has finished transmitting.
This happens at a low level, so it is not always apparent, and some modern WiFi access points support something called MU-MIMO, which allows the access point to talk to more than one device at a time. WiFi contention becomes a problem when more than one person tries to access the internet and the signal is not very strong; the effect is to amplify the congestion.

Routing

There are two ways your VPN could be configured (usually by your IT department) when it comes to routing of traffic. When you connect via VPN, your VPN could route only the traffic intended for the office network through the VPN, with all other traffic going through your ISP's modem (the ideal configuration, often called split tunneling); or it could be configured so that once your VPN is connected, ALL your traffic gets routed through the VPN tunnel. The latter implies that if you are watching a YouTube video or Netflix, all that traffic flows through the much slower VPN tunnel. The end result is a very poor end-user experience and potential privacy issues, since the organization will be able to see your personal traffic patterns. It is therefore highly recommended that your IT department configure your VPN to route only traffic intended for the office through the VPN and all other traffic through your own modem. This will reduce the load on the corporate network and improve both your privacy and your general remote working experience.

Application Design

Applications are designed with certain assumptions in mind. Many applications assume they are used on a LAN. Have you ever used Office 365 and opened a Word document in your local Word application from the online portal? It takes several times longer to open the document, as it has to download the document first. Other applications experience the same problems. CaseWare, for example, assumes it has a fast LAN connection between itself and the CaseWare file.
If you use CaseWare on your local home computer and try to access a CaseWare file on your office network via VPN, the latency will have a dramatic impact on performance. It is generally best to access the CaseWare application via Remote Desktop over VPN. That way, CaseWare runs on the Remote Desktop server, local to the network share. We discussed Remote Desktop in more depth here.

Security

There are some additional complications when users work remotely via a VPN. In the traditional corporate setting, IT has full control over all the devices that access network resources and can protect them adequately. The moment a user connects via VPN, however, the user's home computer is connected directly to the corporate network, as if the user took their PC and placed it directly on the LAN at the office (this is an oversimplification). That means that if the user's PC is infected with malware, or is otherwise compromised, the attacker/malware now has access to the corporate network as well. It is therefore paramount that users either patch their home PCs, install anti-virus software, and handle their PC as if it were a corporate PC; or bring their work laptop home and use it to connect to the corporate LAN, provided their IT department supports this decision. Each active VPN connection extends the corporate LAN and therefore increases the risk of exposure. Considering this fact, it is recommended not to leave the VPN connected all the time unless your home network is secure.

A little bit of knowledge...

If you have further questions about your remote performance, be sure to discuss them with your IT department. With your new-found understanding of the variables involved, you should be able to better answer their questions and participate in resolving your concerns.
A little technical knowledge of VPN connections is essential for finance teams to diagnose issues, & involve IT to provide a better remote work experience.
READ MORE
CaseWare Working Papers 2019 - Infrastructure Requirements
- Waldo Nell
- Tips and Tricks
- minute(s)

Back in 2015, we gave you 3 tips for how to ensure maximum performance from CaseWare Working Papers. With the recent release of Working Papers 2019, we revisit the topic and update our recommendations.

Operating System

Working Papers runs on the Microsoft Windows operating system. Windows 8.1 and 10 can both be used. Users with Apple macOS or GNU/Linux cannot run Working Papers natively; you will need to run a Virtual Machine with one of the supported versions of Microsoft Windows installed in it.

Hardware

At the date of writing, CaseWare International lists these as the minimum technical requirements of the program:

- 1 GHz 64-bit (x64) processor; 2 GHz recommended for improved performance
- Minimum 2 GB of RAM; 8 GB recommended for improved performance
- 1 GB free hard drive space
- A monitor with 1024 x 768 resolution or higher
- Internet access during the installation of Working Papers

Additional Components

- Microsoft Internet Explorer 11.0 or higher, as per the Internet Explorer life cycle
- Adobe Acrobat Reader version 10.0 or higher
- Microsoft Office 2010 or later, or Microsoft Office 365 (Desktop version; Cloud is not supported)

Security and permissions

Installation requires local administrative rights to the workstation. Use of Working Papers requires read/write access to the program folder and any folders containing client files.

You want maximum performance, so these minimum specifications should be taken with a grain of salt; they may yield poor performance in some circumstances (large files, many users, etc.).

Key Factors in Better Working Papers Performance

The following are key considerations for the performance of Working Papers:

1) Location of the Data File

Many people work with their CaseWare Working Papers file located on a remote, networked file system. This has many advantages, most importantly the ability to back up and protect the files.
However, accessing networked storage is often much, much slower than the drive in your computer. For users working on files located on a remote file system, the number one thing you can do to improve performance is to move the file onto the computer that is running Working Papers. This can be accomplished by:

- Using CaseWare SmartSync.
- Using a Thin Client solution.
- Using CaseWare's Sign-Out feature.

2) Processor

CaseWare Working Papers is not written to take advantage of multiple cores in your computer's CPU. Modern processors are frequently designed with many lower-speed cores, and Working Papers does not perform well on these chips. For optimum Working Papers performance, focus on maximizing single-core speed.

3) RAM

As a 64-bit application, CaseWare Working Papers is able to make use of a large amount of RAM. For this reason, we recommend at least 8 GB of memory. Plan for the future: considering the low cost of RAM, follow the "More is Better" rule.

Hardware specifications for IT

If you were asked to choose all new hardware and were just thinking about maximizing CaseWare speed, we would recommend the following.

Desktop Configuration: to maximize the performance of large/complex Working Papers files running on the desktop:

- 64-bit version of Windows 8.1 or Windows 10
- A current generation i5, i7 or i9 processor with a base clock speed of 3.2 GHz or higher
- 16GB or more of RAM
- An SSD (solid-state drive) if the Working Papers data file is going to be on the local computer

Thin Client Configuration: if you will be using a thin client approach to provide large/complex Working Papers files to end users, recommendations become a little more complicated. Below are our recommendations for the configuration of the Thin Client server, assuming 20 concurrent CaseWare users:

- Windows Server 2012 R2 Standard or better.
- A Xeon E5-xxxx v4 or newer processor with at least 8 cores, no slower than 3 GHz, in a dual processor setup.
- 64GB or more of RAM.
- Enterprise-grade SSDs (solid-state drives) in a RAID array.
- All Working Papers data files located on this server directly.

Some assumptions about these Thin Client recommendations:

- As user count increases, system resources must also increase.
- No virtualization is anticipated in the above specification. If virtualization is to occur, more RAM may be required.
- The network interface must be at least Gigabit.
Maximizing CaseWare Working Papers 2019 performance requires both the right hardware and the right configuration.
READ MORE
What is your Bitcoin strategy? Accept / Invest / Avoid
- Waldo Nell
- Tech for Execs
- minute(s)

Most people have heard of "Bitcoin". It was all over the news about a year ago with massive increases in value, which led some to ponder whether Bitcoin would become the currency of the future. Much has happened with Bitcoin in the last year (both positive and negative), and we thought it was time to consider the opportunities of Bitcoin from a Finance Department perspective.

As you likely know, Bitcoin (BTC) is one of many types of cryptocurrency. Other well-known cryptocurrencies include Litecoin (LTC) and Ethereum (ETH). Bitcoin is the oldest and most well-known of the cryptocurrencies. It has been around for almost a decade, but only recently began to experience significant increases in value. This increase led organizations around the globe to wonder if and how they should take advantage.

1) Should your organization accept Bitcoin?

For most finance departments, there are 3 reasons not to rush to accept Bitcoin (or any digital currency):

- Low Demand - There is little consumer demand to pay with Bitcoin. This is driven by the appreciation hopes of those who hold Bitcoin, and by high transaction fees.
- Perceived Legal Risk - Digital currencies are treated as 'commodities', not currencies. This can limit your legal options in the event of theft.
- Increased Administration - The steps required to accept digital currencies are not the same as for traditional currencies. While straightforward, it is one more barrier to moving forward. Further, given price fluctuations (discussed below), you may want to quickly convert from Bitcoin to cash. This is an additional administrative task that likely further discourages adoption.

The result is that Bitcoin acceptance is low and getting lower. I'm not certain at this date what would compel Finance to prioritize its acceptance.

2) Should finance invest in Bitcoin?

The answer for nearly all finance departments is that holding significant amounts of Bitcoin today is too risky.
In just the last few years (from Q3 2016), Bitcoin's value grew exponentially. In Q2 2016 a single bitcoin was valued at approximately $590. Roughly six months later, in Q1 2017, the value jumped to $1,400, and in Q4 2017 it peaked at $25,000, before falling to today's value of $8,200 (CAD). This exponential growth followed by a rapid decline highlights the volatility and uncertainty inherent in this modern currency, and that volatility likely makes significant Bitcoin holdings by finance unwise.

Some look to Bitcoin futures to bring Bitcoin into the mainstream financial markets. On the prospect of these futures, JP Morgan set their Bitcoin evaluation to "Bullish" as an investment asset. Some analysts believe the advent of these Bitcoin futures contributed to the major valuation gains seen in late 2017. Largely, however, this futures market has been disappointing.

"Institutional players have stayed on the Bitcoin sidelines, and as long as they are, the futures contracts are likely not to generate substantial amounts of volume." - Craig Pirrong, a finance professor at the University of Houston

The future of Bitcoin and its underlying blockchain is definitely something to keep an eye on. But given all the above, finance should view Bitcoin like any very risky investment: only invest amounts that you can afford to lose.
12 months ago, Bitcoin was all over the news with meteoric valuation gains. The last year has seen the value drop considerably. What should your finance department do in 2019 with Bitcoin? We answer the two biggest questions.
READ MORE
Windows Security Basics for the Finance Professional
- Waldo Nell
- Tech for Execs
- minute(s)

Finance professionals interact with Windows security every day when they provide a username and password to log in to their computer or network. Another common interaction occurs when they want to install a new application and are stopped and forced to call IT. Finance professionals can struggle when security prevents them from accomplishing necessary tasks. As we have argued elsewhere, it is useful to have a bit of insight into how information technology works so you can either solve the problem yourself or better communicate with IT. Just as we did with passwords, we hope to provide finance professionals in small organizations with a better perspective on how their IT department (often outsourced to consultants) ought to leverage File Access Permissions to help mitigate risk.

There are many different kinds of security on a typical computer helping protect you or your organization from problems (malware infecting your network, unauthorized access to company files, etc.). One such security mechanism is called File Access Permissions. Please note that this topic is genuinely complex and varies by operating system. To keep it brief, we will simplify some topics (while remaining largely accurate) and assume a recent Windows operating system.

Access Control Lists and Permissions

Each file, folder, and network share has an associated Access Control List (ACL). An ACL is a list of users/groups, each with specified access rights (called Permissions) dictating what they can do with the file/folder/share. Common permissions include Read and Write. ACL entries might look like this:

File: Inventory2017.xlsx
User Bob: read permission
User Mary: read/write permission

Assuming no other rights are granted elsewhere, user Bob has permission to read the file, but only user Mary is allowed to both read the file and make changes to it. If user Bob tries to save the file, he will get an error.
In a well-designed network, ACLs are usually defined a bit differently, as the above example is brittle and hard to maintain. Imagine how much work it would be to assign access rights file by file, and how easy it would be to make a mistake.

Active Directory, Security Groups, Files and Folders

In a typical Windows-based network, information on all employees, consultants, and computers is stored in a central directory called Active Directory (AD). As you can see from the image below, it is nothing mysterious. It is a hierarchical tree (like an organizational chart) grouping users and storing details such as your login name, password, email address, etc.

The grouping feature is valuable. The idea is simple: a security group is a collection of people who share similar access rights. For instance, HR staff may need access to specific files while Accounting may need access to different files. By creating an HR group and an Accounting group, and adding HR staff and Accounting staff to their respective groups, the IT administrator can apply security policies based on groups, not individual users. When an HR employee leaves the company and a new one is hired, the new employee only needs to be added to the right group and all access permissions automatically apply. Thus, a well-designed ACL will leverage these groups and might look like this:

Folder: E:\data\human resources
Group HR Staff: read permission
Group HR Managers: read/write permission

Two benefits of the above approach:

- The ACL is applied to the whole folder, ensuring that all files and folders stored in it have the same permissions. New files automatically share the same permissions as the folder.
- Permissions are assigned to groups, so anyone belonging to the HR Staff group has read-only access to the files and folders, and HR Managers have full access. This reduces the workload on IT and ensures consistency.
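The group-based approach can be sketched as a tiny model. The group names and memberships below are illustrative only (real ACLs live in NTFS and Active Directory, not in Python), but the logic - a user's rights are the union of the rights of the groups they belong to - is the same:

```python
# Hypothetical groups and memberships, for illustration only.
GROUPS = {
    "HR Staff": {"alice", "bob"},
    "HR Managers": {"mary"},
}

# The folder ACL from the example: rights granted to groups, not users.
FOLDER_ACL = {
    "HR Staff": {"read"},
    "HR Managers": {"read", "write"},
}

def permissions_for(user):
    """Union of the permissions of every group the user belongs to."""
    perms = set()
    for group, members in GROUPS.items():
        if user in members:
            perms |= FOLDER_ACL.get(group, set())
    return perms

print(sorted(permissions_for("bob")))   # ['read']
print(sorted(permissions_for("mary")))  # ['read', 'write']
```

Notice that replacing an HR employee means editing one membership set; the ACL itself never changes, which is exactly the maintenance benefit described above.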
Network Shares and Conflicting Rights In addition to file and folder level ACL entries, a network share can have a separate set of permissions. In the example above, if the folder is shared on the network under the name "human resources", then the share itself can have an ACL associated with it similar to the following:

Share: \\server\data\human resources
- Group Everyone: read

If the ACL on the share were configured as above, nobody would be able to write to that folder over the network. Both the share ACL and the file/folder ACL need to allow access. They are defined in two different locations and are not related to each other in any way. The best way to think of this is that the most restrictive permission set wins. One other permission that you should know of besides Read and Write is Execute. A program file requires you to have the Execute permission before you may launch it. Attributes Lastly, files and folders may have certain attributes. The Read-only attribute is significant to end users, as this flag may override any write permissions set on the file. If a file is marked as Read-only, the user will be unable to modify it even if they have the write permission in the ACL and the shared folder permissions that we discussed above. This flag can usually be removed by the end user as long as the user has write access to the file. Once the flag has been removed, the file can be written to, assuming the user has write access based on the ACL. Security Warnings on Downloaded Files One last issue you may run into from time to time has to do with files downloaded from the internet. If you are using Windows 7 or later, your computer keeps track of the source of the file and will protect you from files that originated outside of your organization. Windows uses a feature called "Alternate Data Streams" (ADS) to remember which files originated from external network sources. When Windows detects you trying to open one of these files, it will warn you.
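The "most restrictive permission set wins" rule is just a set intersection: access through a share is whatever both the share ACL and the file/folder ACL allow. A hypothetical sketch using the example above:

```python
# Effective access over a network share = share ACL AND file/folder ACL.
share_perms = {"read"}            # Share: Everyone -> read
ntfs_perms = {"read", "write"}    # Folder ACL: HR Managers -> read/write

effective = share_perms & ntfs_perms   # set intersection
print(effective)  # {'read'} - the write granted by the folder ACL is blocked
```

This is why an HR Manager with read/write rights on the folder still cannot save changes through this share: the share's Everyone-read entry is the more restrictive of the two.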
If your IT department allows it, Windows will ask for your explicit permission to open the file. However, your IT department can set up your permissions so that you have no choice and are simply blocked from opening these files. In conclusion, accessing files and folders in Windows is governed by several layers of protection:

1. File/folder based ACLs assigned to groups/users
2. Network share based ACLs assigned to groups/users
3. File attributes such as Read-only
4. ADS-based blocking of downloaded files

All four layers need to grant you access before you can work with a file. Also, access might be partial, such as read-only, or read/write but not delete. What you should try if you have access issues: To check whether you have access to a file or folder, open File Explorer, navigate to the file/folder in question, right-click and select Properties.

1. For file-based ACL rights, go to the Security tab. Click on your name or the AD group you belong to (you may need to ask IT if you do not know this) and check your permissions.
2. For network share based ACL rights, locate the mapped network drive, right-click the drive and select Properties. The Security tab will show the ACL for the network share.
3. For file attributes, review the General tab and check whether Read-only is ticked.
4. To see if a downloaded file is being blocked (assuming you have write access to the file), right-click it, select Properties and then unblock it at the bottom of the Properties window.
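The Read-only check in step 3 can also be done programmatically. A rough sketch in Python (on Windows, `os.chmod` toggling the owner write bit corresponds to ticking/unticking the Read-only attribute; this is a simplified, illustrative helper, not an IT-approved tool):

```python
import os
import stat

def is_read_only(path: str) -> bool:
    """True if the owner write bit is cleared (Read-only attribute on Windows)."""
    return not (os.stat(path).st_mode & stat.S_IWUSR)

def clear_read_only(path: str) -> None:
    """Re-enable owner write access, i.e. untick the Read-only attribute."""
    os.chmod(path, os.stat(path).st_mode | stat.S_IWUSR)
```

Remember that clearing the attribute only helps if the ACL layers above also grant you write access.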
Have you been denied access by Windows to a file you need? Here are the Windows Security basics to aid finance in fixing the issue or communicating with IT.
READ MORE
What You NEED To Know About Password Security
- Waldo Nell
- Tech for Execs
As we have discussed previously, a little knowledge about technology & security can go a long way to mitigating risk. That is especially true of one very important and fundamental topic: passwords. There are three basic methods to authenticate yourself to a 3rd party (e.g. a website, an application, or your network):

- What you have refers to something in your physical possession - a key, a phone, an access card.
- What you are usually refers to biometrics - your fingerprint, your retina, your voiceprint, your DNA, etc.
- What you know typically means a password, but could also refer to security questions like, "What is your mother's maiden name?"

This article considers two important aspects related to the use of passwords in the modern day: 1. How secure is your password? 2. What can you do to improve your digital security? Most corporate systems today still ignore the first two authentication methods. They are protected by the venerable password - a combination of letters, digits and symbols that acts as the key to unlock some digital asset. The strength of the password system relies on the assumption that you have chosen some combination of characters that can be easily remembered by you, but not easily guessed by a bad guy trying to access your asset. We use the term "asset" since passwords are used to protect many different things: online banking and email accounts, social media accounts, the login to your office PC, password-protected documents, the PIN on your bank card and so on. How secure is your password? Stop and think about the password you used to log in to your own computer today. Most systems have some basic rules like "At least 8 characters long with at least one upper case letter, at least one lower case letter, and at least one number." Your password might look a lot like "passw0rd". (Don't be embarrassed, it's a very common password.) If I wanted to break into your computer, how could I guess that password?
There are several ways to approach this problem. Let's consider just one approach: a brute force attack. The computer program we will use works through a specific character set. Let's simplify and assume the character set consists of all lowercase and uppercase letters and digits. So we have a-z, A-Z and 0-9. The program will start its first guess by trying "a" as your password. It will fail and the computer will try "b" and so on until it gets to "z". It will then try "A", "B", ... "Z", "0", "1", ... "9". It will then move on to "aa", "ab", "ac" and so on until it gets to "99999999". A simple calculation shows that the program has to guess (26 + 26 + 10)^8 = 62^8 = 218,340,105,584,896 combinations of characters to try all of the possible 8-character passwords. 218 trillion combinations! It sounds so large as to be impenetrable. An average office PC from 2005 would take about 170 years to break this password. But on a high-end desktop computer of today, such a password can be cracked within an hour. You may be wondering how bad guys can make so many guesses in an hour without getting detected. The trick is that bad guys rarely try to log in directly as you. Instead, they exploit system vulnerabilities and download large lists of scrambled passwords for thousands or millions of accounts. Once they have the list on their local system, they can try to guess the passwords (using brute force and other methods) at their leisure. Once they have worked out what the passwords are, they then try to log in using the compromised credentials. In real life, bad guys very rarely resort to brute force attacks for longer passwords. Instead, they build up huge word lists consisting of previously cracked passwords from breaches such as Yahoo, LinkedIn, Dropbox, Ashley Madison and so on. Since many people reuse passwords across sites, these lists allow bad guys to quickly crack passwords on many sites. What can you do to improve your digital security? Security is hard to get right.
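The arithmetic above is easy to verify. A quick sketch (the guessing rate is an assumption chosen to match the "within an hour" claim; real cracking rigs vary enormously):

```python
# Brute-force search space for 8-character passwords drawn from
# 26 lowercase + 26 uppercase + 10 digit characters.
charset = 26 + 26 + 10                 # 62 possible characters per position
combinations = charset ** 8
print(f"{combinations:,}")             # 218,340,105,584,896

# At an assumed rate of 60 billion guesses per second, the whole
# space is exhausted in roughly an hour:
guesses_per_second = 60e9
hours = combinations / guesses_per_second / 3600
print(f"about {hours:.1f} hour(s)")
```

Each extra character multiplies the search space by 62, which is why length matters far more than clever substitutions like "passw0rd".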
The best we can do at this time is to make use of something called defense in depth. The general idea is not to rely on one single measure to protect you, but to add multiple layers that in combination dramatically mitigate overall risk. Here are 7 recommendations designed to do just that:

Use better passwords - Never use the names of your children, birth dates or anything personal in your passwords. Pick a random 12-character or longer password which mixes lower case, upper case, and digits. Doing this one step would increase the time to crack the password from under an hour for a random 8-character password to nearly 41,000 YEARS for a 12-character password! Add special characters (%, $, #) to increase it even more.

Be unique, do not share & do not write - Do not share your passwords. Never reuse passwords between accounts, because cracking one password then gives access to many other assets. Do not write your passwords down unless you can store them where you can guarantee their physical security.

Increase the length of your passwords every 3 years - Make sure to revisit your passwords every two to three years. Remember, as long as Moore's law* holds, passwords that are considered secure today will become weaker in the future. A general rule is to increase the length of your password by 1 character every 3 years.

Use password management software - Most of us have dozens of accounts, and remembering dozens of different, random passwords is near impossible. Fortunately, there is a solution to this dilemma. Password managers are applications that you install on your computer/phone that will remember and manage your passwords for all the various sites you frequent. Your passwords are stored in a secure, encrypted vault which you protect with a single master password (one that is long and hard to guess but easy for you to remember). This one master password protects all the other random passwords.
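Generating the kind of random mixed-character password recommended above is easy to do properly with Python's standard secrets module, which is designed for security-sensitive randomness (a minimal sketch):

```python
import secrets
import string

def random_password(length: int = 12) -> str:
    """Generate a random password of lowercase, uppercase letters and digits."""
    alphabet = string.ascii_letters + string.digits   # the 62-character set
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(random_password())  # a different 12-character password on every run
```

In practice you would let your password manager generate and store these for you, but the principle is the same: random characters, no personal information.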
Examples of good password managers are 1Password and LastPass.

Two-factor authentication - Fortunately, many sites & software today allow you to set up something called 2FA (Two-Factor Authentication). In addition to providing your password when you try to log on to a service, a short code is sent to your phone by the site as a challenge for you to repeat. By typing in the correct code, you verify what you know (your password) as well as what you have (ownership of your phone). It is much harder in general for a bad guy to both know your password and have control over your phone. If your service provides this feature, enable it. It usually costs nothing extra.

Use fake answers to security questions - Many sites require you to provide one or more security answers in case you lose your password and have to reset it. Providing real answers significantly weakens the security of the account, as it is trivial in today's connected world for a bad guy to scour Facebook / Twitter / Google and find out your mother's maiden name or the name of your high school. It is best to use pronounceable but obscure, false answers and store them in your password manager.

Restrict remote logins - If practical, have IT prevent remote logins by default for all users. In other words, all users must be in one of your offices to access your systems. As few attackers will risk sauntering into your office and sitting at your computer, this improves your security considerably. For those users that do work remotely, have IT limit access to specific authorized locations (IP addresses), or make use of a VPN.

*Back in 1975, a guy by the name of Gordon Moore (working at Intel) predicted that the number of transistors in an integrated circuit would double every two years. This translates loosely to a doubling of processing power once every two years, also known as exponential growth.
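The short codes used by most 2FA apps are generated with the TOTP algorithm (RFC 6238), a time-based variant of an HMAC one-time password. A minimal sketch using only the Python standard library - the secret shown is the published RFC test key, not a real one:

```python
import base64
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    msg = struct.pack(">Q", counter)                       # 8-byte counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                             # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret_b32: str, period: int = 30) -> str:
    """RFC 6238 time-based one-time password: HOTP over a 30-second counter."""
    key = base64.b32decode(secret_b32)
    return hotp(key, int(time.time() // period))

# RFC test key "12345678901234567890", base32-encoded as apps store it:
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"))  # six digits, changes every 30s
```

Both sides (the site and your phone) share the secret and the clock, so the site can verify the code without ever transmitting it - which is why an attacker needs your phone, not just your password.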
This prediction has held true for four decades, and the implications for our digital security are significant. The computer you buy in two years can crack longer passwords in less time than the one you have today.
Finance officers rely on technology & trust their passwords to provide security for confidential data. These 7 tips ensure your passwords are up to the task.
READ MORE
5 Best Practices for Updating Critical Software
- Waldo Nell
- Tech for Execs
Finance officers, accountants and auditors are experts in a very specific field of knowledge. Generally speaking, that does not include great depth related to IT systems and best practices. Yet finance officers, accountants and auditors rely on software (and hardware) to allow them to complete their work, often under very tight deadlines. This article lays out the case for finance involvement in the update process and provides 5 key steps for finance. Running with Scissors Almost certainly all of your critical applications (ERP, budget, HR, financial reporting, payroll etc.) get updated from time to time. For many of our clients, finance leaves this process entirely in the hands of IT. Often finance does not even know updates have occurred until they come in Monday morning and notice that something looks different. This is a very dangerous approach. Updates are risky - kind of like running with scissors. You may have done it a hundred times with no problems, but one little misstep and it can be a disaster. You may appreciate this if you have ever lived through a failed update. To give you a little more control and allow you to better mitigate this risk, we recommend our clients follow a 5 step process to manage their updates. 1) Work with IT - Make sure IT knows that they need to include finance in all critical update processes. This does not mean sending an email to finance when the job is done. It does mean advising finance before anything happens (ideally when updates are first made available) so you can work collaboratively to ensure everything works well. 2) Know the benefits - Ensure you have a clear idea of what the benefit of the upgrade to the organization will be. Amazing new features that you have been dying for? Major bug fixes that will prevent lots of errors? If not, the right answer may be to sit this upgrade out, unless the upgrade is required to improve network security, for example.
3) Don't upgrade during critical periods - Now that IT is communicating with you when updates are pending, you can advise IT on when a good time is for finance to do the update. For example, the middle of year-end is NOT a good time for just about anything, let alone an update to your Comprehensive Annual Financial Report / financial statement software. Some processes never stop (payroll, for example). For systems that relate to these processes, look for the least-bad time periods to do upgrades. 4) Test first - Develop a testing plan to ensure that all features function acceptably in your environment with your data before you roll them out to the live system. This will involve several steps:

- Install in a test environment first, or take full backups of all data before installing the new versions, and plan for some downtime if you need to roll back in the latter approach.
- Get key system users to test all functions that they rely on to ensure they perform as expected. Note - if you used the backed-up live data option above, you will need to notify users that the system will be unavailable while testing is occurring.
- If the new version passes all testing, take a backup of the live data and update the real / live server or environment. If you did not have a test environment and took backups of data before testing, you have no more work to do other than advising users they can use the application again.

5) Plan for failure - Develop a plan to deal with failed test results. If you followed steps 3 & 4 above, even with failed tests you are safe - no harm done. If you took a backup and then installed over the previous version of the software, now is the time to recover from the backup. Next, communicate the problem to the vendor. If it takes the vendor a month to resolve your issue, day-to-day operations have not been affected, and while not optimal, it is survivable. When they give you a new update version, start at Step 1 above and repeat.
Following this simple 5 step approach will limit your stress and help you avoid the vast majority of problems that haunt the dreams of finance officers everywhere.
You rely on your key applications but updates and upgrades can be risky. Here are 5 steps to mitigate risk and guarantee better outcomes.
READ MORE
Tech for Execs: RDS & XenDesktop
- Waldo Nell
- Tech for Execs
Microsoft RDS: Microsoft provides a feature that is built in to every Windows Server operating system called Remote Desktop Services (known as Terminal Services in Windows Server 2008 and earlier). RDS is Microsoft's technology to support thin client computing, where Windows software, and the entire desktop of the computer running RDS, are made accessible to a remote client machine that supports Remote Desktop Protocol (RDP). With RDS, only software user interfaces are transferred to the client system. All input from the client system is transmitted to the server, where software execution takes place. Citrix XenDesktop: A similar but standalone product called XenDesktop (replacing MetaFrame) is available from Citrix if you need more advanced functionality than Microsoft's RDS. Some version of this Citrix solution has been around for much longer than RDS. In fact, Microsoft RDS was initially built on Citrix technology, before Microsoft developed its own protocols. The core functionality of the two products is largely similar. The primary benefit of Citrix XenDesktop over Microsoft's RDS is better support for audio, video and graphics. Citrix XenDesktop is more powerful and is used by organizations requiring more fine-grained control over the virtualization experience. To see our recommended hardware specifications for your RDS or Citrix server, check out our other article.
Comparison of the two predominant thin client solutions RDS and XenDesktop
READ MORE
Thin Clients versus Fat Clients Explained
- Waldo Nell
- Tech for Execs
Has it ever seemed like the IT folks spend their days making up arcane terms and acronyms? Every time you turn around there is some new three letter acronym (obligatory acronym: "TLA") or concept being discussed. It can be enough to cause the finance officer to let it all go in one ear and out the other. But here are two important concepts you should know something about: fat clients and thin clients. In this Tech for Execs post we are going to explain these terms to ensure that you have a high-level outline of their meaning and the potential value they provide your organization. Why bother with these particular terms? After all, they are hardly new technology and you are likely not hearing about them as often as the latest buzzwords like "cloud computing". The reasons to care are three-fold:

- It is very likely your IT department is using or has recommended "thin client computing" to you (for good reason!).
- In many circumstances it can significantly reduce IT costs while increasing end-user performance.
- Large CaseWare Working Papers files will almost certainly run MUCH faster in a thin client model.

The Scenario You work for a government / university / corporation with many employees. It is very likely that most of your users have somewhat similar needs in terms of applications and files. All users would most likely want access to Microsoft Word and Excel, your organization's line of business application(s), your accounting software if you are in the finance department and so on. Further, you have files that need to be accessed by multiple individuals, perhaps simultaneously, perhaps from many different physical locations. How does your organization deal with these requirements? There are numerous variations and exceptions with more innovations coming all the time. That being said, there are currently two predominant approaches to meeting these needs.
1) Fat Client (Traditional) Model - only data resides on servers How it works: In the traditional approach, one simply provides each user with their own computer, installs Microsoft Word, Excel, Outlook, etc. on each one and lets everyone work independently. You start your computer and arrive at your desktop. Then you commence running your applications (Excel, Word etc.), which use your computer's resources (processor, memory, storage etc.). If you have no network connection, you can keep on working with all the files that you have saved on the local hard drive. Things get more interesting when your applications need to use data off a server. When data is being accessed off the server, the following occurs:

- The workstation makes a request to the server - "Please send me the data in this large Excel file that has been saved on the X drive."
- The server pushes all the data (and if it is a big spreadsheet it might be a LOT of data) across the network cable to your workstation.
- You commence working on the spreadsheet: add rows, insert sub-totals, recalculate etc. This work is being done on your local machine, but all saving or requests for more data are going across your network cable. Again, when dealing with a big file, this can be very large quantities of data and can therefore slow down the performance of the application.

Pros & Cons: On the upside:

- Other users' activities on their computers have minimal effect on your performance.
- Minimal reliance on complex, expensive servers in this model. Typical use of servers in this environment is for file storage, checking & maintaining passwords and perhaps hosting a database.
- Users can be "self-sufficient" - as long as they have a laptop and the working data they need - when they go home on the weekend or if they travel to a conference.

The downside: Every computer needs to be maintained.
They must be updated for security and bug fixes, as well as the updates of each and every software program.

- Each computer's hardware must be maintained at levels acceptable for the software applications it will be running. As the software programs require more resources, you must upgrade each and every computer that will use that program.
- If any data is stored on the local computer, it then needs to be backed up to protect the organization's data.
- If a new application is needed, it is likely that it has to be installed on multiple computers.
- Because each workstation brings all the data across the network cable to be worked on locally for each user, there is often a tremendous amount of network traffic. In a modern network this may not be an issue, but if there are very large quantities of data or there are multiple physical locations that must communicate, the network bandwidth capacity may not allow for quick transmission of all the data required.
- Because of the first two points above, the organization is committed to continuous investment in each individual workstation to ensure the hardware is capable of running the current version of the software.

This is where the second approach comes to the rescue. 2) Thin Client (Virtual) Model - data & applications on servers How it works: In this model, instead of using multiple powerful workstations & laptops to run the applications, all that work is done by one or more servers. End users are given terminals, sometimes called "dumb terminals". Essentially all they do is present the user with a picture of a computer interface on their monitor and accept the user's input (mouse clicks and keyboard typing), sending it along to the server for processing. All data, programs, and processing remain on the server. In many ways this model is a return to the mainframe model of the 1950s.
There are two primary technologies to discuss that drive this model, one provided by Microsoft ("Remote Desktop Services" or RDS) & another provided by Citrix (XenDesktop). If you want/need to participate in a conversation with IT more fully about these technologies, you can find the details here. Pros & Cons: The benefits of thin client computing are many:

- "Dumb terminals" need little to no maintenance. Only the servers need to be upgraded & maintained.
- Only the server's hardware must be maintained at levels acceptable for the software applications and the number of users (more users = more horsepower required for servers).
- No data can be stored on the local computer. Thus data is better controlled and backed up.
- If a new application is needed, it must only be installed once - on the server.
- Thin client software is long established and well known in the IT community.
- Network traffic is made more predictable / even. Depending on the data being presented to the remote user, network traffic may even be reduced. Thus, even with minimal network bandwidth capacity, end users may not experience any slowness. Especially when remote access across the Internet / VPN is required, this may be the biggest benefit of all.
- Because of the first four points above, the overall result is often a much lower total cost of ownership for the organization versus the traditional model.

But there are some concerns & drawbacks:

- If users need to deal with a lot of video & audio (especially creation & editing), the centralization of thin client computing can tax the servers tremendously, resulting in poor performance.
- The servers (their software and setup) are more complex & expensive than the servers in the traditional model.
- The downside of centralization is a single point of failure. Should your application server fail, all your users will be affected. To mitigate this risk, IT must build in redundancy, which offsets some (not all) of the savings.
Users might be tied to a terminal and not able to move around the office or off the network to perform any of their work. To be clear, your IT department can provide remote access to the RDS / Citrix server, but they will often be very concerned about increased security risks if they do so. Impact on CaseWare: Very large CaseWare Working Papers files used in a traditional model (data file on the server and application on your workstation) can be very slow, especially when network performance issues get in the way. There are some recommendations to maximize your CaseWare performance in this model, but you might benefit by changing models. Moving to a thin client model with CaseWare can yield major performance improvements. Note, though, that due to the internal architecture of CaseWare Working Papers, the closer the Working Papers file is to the actual application, the better the performance. And by close we mean not physical distance, but access time. The best performance will be obtained if the Working Papers file is stored on the same RDS/Citrix server which hosts the CaseWare Working Papers application. If your organization is not currently using one of these thin client solutions, and your only concern is CaseWare performance, we also strongly suggest considering CaseWare SmartSync. To get more tech tips for executives, be sure to sign up for our blog In Black & White.
This Tech for Execs explains Thin Client computing & enables finance professionals to understand the benefits both with CaseWare and generally
READ MORE
Tech for Execs: Ignorance is not bliss
- Waldo Nell
- Tech for Execs
F.H. Black & Company Inc. works with governments, universities, large companies and accounting firms. The professionals we work with are typically auditors, finance officers, accountants and CFOs. They have great expertise in accounting & management and tremendous experience in their industries. They may even spend some time refining their skill with the specific applications that they use (Excel, CaseWare, etc). Often, though, they have minimal expertise with the general computer technology infrastructure that surrounds them. Why should finance & budget officers want to become more sophisticated with technology? Increase your professional performance: Improving the general technological literacy of finance professionals can yield a tremendous increase in their overall efficiency and effectiveness. Improve your ability to assess & mitigate risk: Having at least a basic level of knowledge is essential to avoid fraud, data loss, identity theft and a myriad of other modern-day hazards. Not just personally, but as a critical part of the internal control process, finance's facility with technology impacts the entire organization's safety. FHB's series of articles on Technology for Executives ("Tech for Execs") will enable you to do just that - improve your fluency and ability to use technology, increasing your efficiency, effectiveness & reliability. An important starting point for this series of articles is to consider: Why is it that so many finance and budget professionals are less than perfectly conversant with computer technology? Why is the need for expertise in computer technology different from that of any other tool you rely on? A very recent addition to the world The explosion of the computer age over the last 40 years is unlike anything the human species has ever encountered before. No other technology has grown to have such an overwhelming influence on so many people in so short a time.
Due to this very rapid adoption, there is a segment of the world's population that is relatively inexperienced in using this new technology. That is understandable - most people aged 40 and over didn't grow up with computers, and if you are in your 50s, perhaps much of your early professional career was without significant computer usage. When you combine the short time-line with the magnitude of the change, computer technology aptitude is not necessarily an easy skill to develop. A tool unlike others Regardless of age, most people use the internet the way we use appliances or our cars. We use them to provide a service to us and don't care about their inner workings. Why would you want to understand the Otto cycle in your car? Or how the magnetron works in your microwave oven? Or how the escapement mechanism keeps your mechanical watch ticking? As long as it performs its function, we just don't care. Nor should we. You do not need to understand these things to drive your car or heat a cup of tea. If it breaks, you send it in to be repaired or you replace it. Not understanding the inner workings does not affect us negatively. With computer technology there is a superficial similarity. You do not need to understand nomenclature such as CPU, RAM, HTTPS or CHAP to be able to send a tweet, post a Facebook update or call a friend over your internet phone. But this is where the analogy stops. Most of the technology you use today relies on connectivity. Sending email, browsing the Internet, accessing Facebook or LinkedIn means that your computer is connected to billions of other computers. This connectivity, uniquely among the tools and technologies you use, exposes you to massive risk. The difference between securing a thing like your house and securing your computer or smartphone is that for someone to break into your home, they have to be physically standing in front of your house.
There are only a small number of people geographically close to you, severely limiting how many bad guys could actually attempt to break in. With your internet-connected computer, you suddenly have more than a billion people that can attack you. The threat gets even worse: most people do not have the knowledge or skill to even understand that they are under attack, or to tell whether their computer system has been entirely compromised. Everyone understands how a burglar could break into their house: through a window, by picking the lock, over the roof, etc. But almost nobody (statistically speaking) knows how bad guys get into your computer. It is black magic, mysterious and spooky. This does not mean everyone should be signing up for computer security courses, or that you even need to know what the acronyms I mentioned earlier stand for. Just as you do not need to know what the Otto cycle is to recognize the danger in crossing a road filled with cars and cross it safely, we will be teaching you how to better prepare yourself for this new era of interconnectedness. Naturally, we cannot possibly cover all aspects, as this is indeed a very complex landscape requiring long, intensive study to truly master. We will merely be applying the Pareto principle in trying to teach you the 20% of what you need to know that will make you 80% safer.
Tech for Execs discusses why accountants & finance professionals are likely not technology experts and why they should increase their expertise
READ MORE