This is a bit of a cheat, because I already went over SQL Server on Linux, but I thought this one deserved special notice. Did you know you can now run your SSIS packages from your Linux box with SQL Server? You can.
Just pop in this little one-liner and you’re off to the races*.
$ dtexec /F <package name> /DE <protection password>
If push came to shove, you could even put that on a cron job should you want. You still need a Windows server to create and maintain the packages, but you can run them locally from the Linux box if you're trying to keep the family together for the kids.
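For the curious, the cron entry might look something like the following. The package location, password and log path are all placeholders, and I believe the Linux build puts the dtexec binary under /opt/ssis by default, so hedge accordingly:

```shell
# Run the package every night at 2 AM, appending any output to a log
0 2 * * * /opt/ssis/bin/dtexec /F /var/opt/ssis/MyPackage.dtsx /DE MyPassword >> /var/log/mypackage.log 2>&1
```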
This is a feature that kinda existed previously, but it was just called "R Services." The big thing of note is that it now supports Python and its associated libraries. See the previous post in this series to catch my sarcasm about Python not being included in the first place.
One thing to note about Machine Learning Services is that it's not supported in-database on Linux. You can still do things like native scoring (PREDICT), but that's just about the long and the short of it. Microsoft is making noises like they're going to address this in the somewhat-near future.
This is a pretty neat feature that I hadn’t thought about before. What if you had several servers that potentially COULD handle an SSIS workload (in an HA scenario or something), but you didn’t want to always target the same instance. You know, spread the love around.
SQL Server 2017 allows you to set up a master on your main instance and then workers on the servers you want to be able to scale-out to. After a bit of setup on your worker machines you can then either target machines with specific packages or let SSIS decide. Check out this walkthrough for more.
This is less a new feature and more a major revamp of something that already existed. The new Reporting Services default Web Portal is a lot snazzier and has some new things. You can customize the branding of the instance and even develop KPIs that are contextual to the folder you're currently viewing.
Master Data Services has had a rough life. Beginning life as far back as SQL Server 2008R2, this has been the red-headed stepchild of the SQL Server offerings. It started out, in my humble opinion, barely usable, unnecessarily complex and just feature-poor. I’m not alone in this opinion.
Subsequent releases have helped it, but even after several versions it was still pretty weak and only useful for very specific cases. MDS only really came into its own in 2016, and even then with some performance limitations.
Edging ever closer to a more perfect product, SQL Server 2017 features some much-needed performance optimization, allowing it to stage millions of rows in a reasonable amount of time. Previously it was painfully slow with only a few hundred thousand records.
Lastly, they fixed the slow UI movement when doing things like expanding folders on certain pages.
Not any this time, unless you want to talk about SSAS object-level Security or DAX finally getting an IN operator. Those seem pretty useful.
That's it for 2017. There are, of course, many, many more changes and new features in SQL Server 2017, but I think ten or so is enough to give you a taste. There are changes all across the product, and I encourage you to look them over yourself.
It’s finally here! A few days ago as of this writing, SQL Server 2017 was released for Windows, Ubuntu, RedHat and Docker. There are a lot of new things in SQL Server 2017 from Python support in Machine Learning* to better CLR security. But I thought I’d narrow down the list to changes that I’m most interested in.
Installation is a cinch, especially if you’re a Linux or Unix person. Curl the GPG keys, add the repository, apt-get (if you’re on a real Distro) the installer and run the setup. It’s really that easy.
All the main features are there, the engine (of course), agent, Full-text search, DB Mail, AD authentication, SQL command-line tools etc. Pretty much everything you need to get going on Linux even if you’re in a Windows-dominated environment.
2. It’s Like A Car That Repairs Itself
So this is a big one. With automatic tuning on, SQL Server can detect performance problems, recommend solutions and automatically fix (some) problems. There are two flavors of this, one in SQL Server 2017 and one in Azure; I’ll be talking about the one in SQL 2017 here.
Automatic plan choice correction is the main feature; it checks whether a plan has regressed in performance and, if the feature is enabled, reverts to the old plan. As you can see below, a new plan was chosen (4) but it didn't do so well, so SQL Server 2017 automatically reverted to the old plan (3) and got most of the performance back.
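If you want to kick the tires, turning it on is a one-line database setting, and what the engine detected (and did about it) is surfaced through a DMV. A quick sketch, with a placeholder database name:

```sql
-- Turn on automatic plan correction for a database
ALTER DATABASE [SomeDatabase]
SET AUTOMATIC_TUNING (FORCE_LAST_GOOD_PLAN = ON);

-- See what regressions were detected and what was (or would be) done about them
SELECT reason, score, state, details
FROM sys.dm_db_tuning_recommendations;
```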
I’m sure that Microsoft will be expanding this feature in the future and we can expect to see more from this. Azure already has automatic Index tuning in place, so we’ll probably see that in the On-Prem version eventually.
3. Indexes That Start, Stop And Then Start Again
This is a feature I didn’t know I needed. Basically, it allows an online index rebuild that has stopped or failed (say, it ran out of disk space) to be resumed. The index build fails, you fix whatever made it fail, and then resume the rebuild, picking up from where it left off. The rebuild will be in the ‘PAUSED’ state until you’re ready to RESUME or ABORT it.
```sql
-- Start a resumable index rebuild
ALTER INDEX [NCIX_SomeTable_SomeColumn] ON [dbo].[SomeTable]
REBUILD WITH (ONLINE = ON, RESUMABLE = ON, MAX_DURATION = 60 MINUTES);

-- Pause the rebuild
ALTER INDEX [NCIX_SomeTable_SomeColumn] ON [dbo].[SomeTable] PAUSE;

-- Resume, either after a failure or because we paused it. This syntax also
-- makes the resume wait 5 minutes and then kill all blockers if there are any.
ALTER INDEX [NCIX_SomeTable_SomeColumn] ON [dbo].[SomeTable]
RESUME WITH (MAX_DURATION = 60 MINUTES,
             WAIT_AT_LOW_PRIORITY (MAX_DURATION = 5 MINUTES, ABORT_AFTER_WAIT = BLOCKERS));

-- Or stop the whole thing, because you hate unfragmented indexes
ALTER INDEX [NCIX_SomeTable_SomeColumn] ON [dbo].[SomeTable] ABORT;
```
This is also great if you want to pause a rebuild because it’s interfering with some process. You can PAUSE the rebuild, wait for the transaction(s) to be done and RESUME. Pretty neat.
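If you're wondering whether a rebuild is paused and how far along it got before stopping, there's a DMV for that too. Something like this, run in the database doing the rebuild:

```sql
-- Check on any resumable index operations and their progress
SELECT name, state_desc, percent_complete, start_time, last_pause_time
FROM sys.index_resumable_operations;
```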
4. Gettin’ TRIM
This one is kind of minor, but it excited me a lot because it was one of the first things I noticed as a DBA that made me say ‘Why don’t they have a function that does that?’ (note the single quotes). TRIM will trim a string down based on the parameters you provide.
If you provide no parameters, it will just cut off the spaces from both sides. It's equivalent to writing RTRIM(LTRIM('  SomeString  ')). You can also hand it a set of characters to strip from both ends:

```sql
SELECT TRIM( '.! ' FROM '   SomeString !!!') AS TrimmedString;
-- Returns: SomeString
```
5. Selecting Into The Right Groups
Another small but important change. In previous versions of SQL Server, you could not SELECT INTO a specific filegroup when creating a new table. Now you can, and it uses the familiar ON syntax to do it.
```sql
SELECT [SomeColumn]
INTO [dbo].[SomeNewTable] ON [FileGroup]
FROM [dbo].[SomeOriginalTable];
```
Also, you can now use SELECT INTO to import data from PolyBase. You know, if you're into that sort of thing.
Here are some things that caught my eye, but didn’t really need a whole section explaining them. Still good stuff, though.
Query Store Can Now Wait Like Everybody Else
Query store was a great feature introduced in SQL Server 2016. Now they’ve added the ability to capture wait stats as well. This is going to be useful when trying to correlate badly performing queries and plans with SQL Server’s various waits.
See the sys.query_store_wait_stats catalog view in your preferred database for the deets. Obviously, you'll need to turn on Query Store first.
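A rough sketch of what that looks like (the plan IDs will obviously vary by database, and this assumes Query Store is already on):

```sql
-- Which plans are waiting on what, straight from Query Store
SELECT plan_id, wait_category_desc, total_query_wait_time_ms, avg_query_wait_time_ms
FROM sys.query_store_wait_stats
ORDER BY total_query_wait_time_ms DESC;
```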
Con The Cat With Strings
Just as minor as TRIM in some people's books, but this is a great function for CONCAT_WS'ing (it's not concatenating, right? That's a different function) strings with a common separator while ignoring NULLs.
```sql
SELECT CONCAT_WS('-', '2017', '09', NULL, '22') AS SomeDate;
-- Returns: 2017-09-22
```
Get To Know Your Host
sys.dm_os_host_info – This dynamic management view returns host information for both Windows and Linux. Nothing else, just thought that was neat.
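A quick peek at what it returns (on a Linux install, host_platform comes back as Linux along with the distribution details):

```sql
SELECT host_platform, host_distribution, host_release
FROM sys.dm_os_host_info;
```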
Get Your Model Serviced
Not something I’m jumping for joy about, but it is a good change of pace (I think). No more Service Packs, only Zuul… er… just Cumulative Updates. Check out my article on it if you want to know more about all the Service Model changes.
And Lots Of Other Things
Of course, there are hundreds of other things in SQL Server 2017 and its related features. I didn't even touch on the SSIS, SSAS, SSRS, MDS or ML stuff. Check out the shortened list here, broken down by category. Exciting new toys!
* Seriously, how do you release a data science platform and not include Python? That's like releasing a motorcycle with only the rear tire. Yes, you can technically use it, but you're limiting yourself to a customer base with a very selective skill set.
What I gushed over a few posts ago has finally happened! SQL Server has come to Linux (sort of). The database engine is now available as CTP1, and you can get it by adding the repository and running the setup script.
You can follow the walkthrough for your favorite flavor of Linux, so I won't repeat that here. It's really very simple: just a matter of pointing to the correct repository and then apt-get install (on Ubuntu). It comes with a setup script that does pretty much all the heavy lifting for you. Keep in mind that this is just a preview, so there aren't a lot of options and it sticks everything in a single set of directories (logs/data/tempdb).
I had a small problem when I did the install, but it turned out I just needed to update a few packages. In the event you’re not a Linux person, here’s the easiest way to fix this:
```shell
$ sudo apt-get update
$ sudo apt-get upgrade
```
There’s a lot of stuff to dig into in this release, and as newer versions come out I’ll get more in-depth, but I just wanted to make a quick post about what I did in my first thirty minutes.
After the install, I connected via SQLCMD (as there's no SSMS on Linux yet) using the sa account and the sa password set during the install. I then created a table, dropped a single row into it and selected it back out. Not terribly complex stuff.
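It was something roughly along these lines (table name and values made up for illustration, obviously):

```sql
CREATE TABLE dbo.SomeTestTable (Id INT, SomeText VARCHAR(50));
INSERT INTO dbo.SomeTestTable (Id, SomeText) VALUES (1, 'Hello from Linux');
SELECT Id, SomeText FROM dbo.SomeTestTable;
```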
I took care to try different cases, adding and omitting brackets and semicolons. It responded just how I'd expect it to on a Windows system, which is very reassuring. It's nice that my T-SQL skills translate seamlessly to the Linux environment, at least internally to SQL Server.
Next, I put my box 'U64' on the network and, lo and behold, I was able to remote into it by its Linux hostname from SSMS 2016 on a Windows machine. No additional setup was required. Microsoft appears to be taking this integration of the Linux and Windows environments seriously.
I then created a SQL login for myself and logged in that way. No issues.
This is of course just CTP1, so a lot of these items will probably show up later. I mean, SQL Server without the SQL Server Agent? That doesn't even make sense (I'm looking at you, Express Edition). There's also a sort of cascade effect: other items, like Maintenance Plans, that rely on these missing features are also MIA.
Also, larger items like Availability Groups will be absent because there's no Linux analogue for them currently. From what the SQL Server team said in their AMA on Reddit, they're toying around with Red Hat clustering as a replacement in the Linux environment.
The last thing I did before the end of my 30 minutes was to look at the version. As you may or may not know, the Linux version is based on SQL Server vNext, which (as the name implies) is the NEXT version of SQL Server. There was some talk about it being a port of SQL Server 2016, which does not appear to be the case.
Microsoft SQL Server vNext (CTP1) - 14.0.1.246 (X64)
Nov 1 2016 23:24:39
Copyright (c) Microsoft Corporation
on Linux (Ubuntu 16.04.1 LTS)
And that’s it! As mentioned before I’ll be doing deeper dives into this as time goes on, at the very least with each CTP. But I have to say I’m happy with the results so far. Everything (that was available) worked as I expected it to work. Nice work MS!
While I’m putting together my big update on Inventory Manager, I thought I’d take some time to throw confetti into the air. There may be some excited clapping as well. I warned you.
I largely see myself as platform-agnostic. While I think that certain companies do individual products well, I also believe it’s fair to say that none of them do everything well. I use Android phones and Apple tablets, Linux for home (mostly) and Windows at work. Heck, I’ve got a Roku and a Chromecast because they both do things that the other doesn’t. I’m all over the map, but all over the map is a great place to be, especially in the tech industry now.
Despite all of this, I have to admit I am partial to Free Open-Source Software (FOSS). Give me a choice between Ubuntu and Windows, and all other things being equal, I’ll choose the Debian-based option. I’ll admit my biases.
So, when MS started moving in this direction I was happy. I wanted to see this trend continue, and boy has it. First of all…
When Microsoft announced that .NET was going open source, I was cautiously optimistic. I'm not a big .NET coder, but I could see the benefit and was hopeful that MS would continue down this path. This led to some cool things I thought I'd never see in a million years, like .NET running on Red Hat.
There’s understandably some cynicism about Microsoft’s true intentions, as well as their long term goals, but this is the cross-over that I’ve been wanting to happen for a while. Blending the strengths of RHEL with .NET on top is a great start. If the .NET development platform can be ported, why not parts of the Windows Management Framework? We could even one day see…
I didn't always like Powershell; in fact, prior to Powershell 3, I just referred to it as PowerHell. Since 4.0, however, it's no secret that I'm a fan; one look at my github will tell you that. I like its logical approach to (most) things and that it works for simple scripts quite easily, while being a powerhouse (no pun intended) behind the scenes.
This shell coming to OSX and Linux will be a boon for both systems. While I am, and will probably always be, a bash scripting guy, Powershell in Windows just makes everything so gosh-darn easy. If I could whip up a PS1 script with a few imported modules and attach it to a cron job with ease, then I think everybody wins, mostly me. But, if I decide that I want to use bash instead, that’s okay because…
This isn’t a one way transition. Microsoft is making a trade, bringing one of the most widely used shells to Windows. This not only makes scripts more portable, but also knowledge.
Have some ultra-fast Linux bash script that works wonders? Super, you now have it on Windows, too. Wrote a script to do some directory work in Powershell? Great, you now know how to do it on Linux.
There are very few downsides to this, other than the obvious security issues and that it isn’t truly a stand-alone shell (it’s part of Ubuntu on Windows). In any case, it allows interoperability between software from different systems. This is great now that…
This isn’t technically going open source, as it will run inside a container, but the idea that this will now be possible and supported is like something out of my greatest dreams.
I have a maybe-controversial opinion that SQL Server is the best relational database system out there. For all its faults, I’d rather use SQL Server 2005 SP1 than Oracle 12c. Just the way I feel, and for reasons I won’t go into here. I hope the things I like about SQL Server translate to the Linux environment.
The fact that Ubuntu is supporting this with Microsoft is great. I can’t wait to use my favorite OS with my favorite database engine on the same system.
There are other items I’ve glossed over, but these are the big ones to me. Soon, we will be able to run SQL Server on Ubuntu Linux with cron jobs executing Powershell for a .Net application that resides on an RHEL box. *excited clapping* (I warned you.)
In the wake of the NSA/British Intelligence scandal, and the continuing surveillance of Internet Service Providers and websites such as Google, the interest in personal privacy has grown. While this article won’t be a long-form argument for personal privacy (mostly because I don’t think I need to do so), there are a few relatively easy things you can do to keep your online persona under your control and there’s good reason for it.
The oft-repeated adage is that “you shouldn’t put anything on the internet that you want to keep private.” While this sounds logical and simple, it usually isn’t. So much of what we do is tied up in the Internet. If you’ve ever bought anything online, done a web search or even paid a bill via a website, then that information is stored somewhere and is accessible to someone. And, while many make the argument that they have nothing to hide, the truth is that you probably do.
Not all of us have a murder or mob ties to cover up, necessarily, but almost everyone has debit/credit card information we don't want out there, or a few less-than-flattering pictures. On a different note, just because what you're doing isn't illegal doesn't mean you want to broadcast it to the world. In fact, you might be breaking the law without even knowing it.
That said, what do we do about it? Is there any way to hide everything we do on the net from everyone? The answer is: not really, but there are things you can do to minimize the amount of data you drop into the internet, and at best make it anonymous (not directly tied to you).
I'm going to outline a few steps you can take, if you're concerned about your privacy, that will give you the most return for time invested. Much like my recent post on internet security, this is a short list of simple things that give you the greatest "bang for your buck".
Your Browsing and Searching
The browser is where most websites will get the information they collect on you. Most of it is pretty general, the OS/browser you’re using, how long you were on the site, and things like that. However, sites that are more clandestine or that you use frequently can collect a large amount of information about you.
Take Google for example. This is a website that we know collects data on its users and we know has been syphoned by the NSA (National Security Agency). When you log into any of their services, or do any searches from the site, all that information is stored and linked together. This data, over time, can build a pretty accurate picture of you based on your search and browsing habits. It’s not even necessary for you to give Google a name for them to find out who you are, as this can be mined from the data you give them. If you’re constantly going to a few sites and logging in, and any one of them has your name anywhere on it, then that can be linked back to your data.
The data doesn’t even necessarily have to come from you. The recent Facebook breach allowed people to access the contact lists of people they didn’t even know and download them. If you are in the contact lists of people who have Facebook and they’ve uploaded their contact lists to Facebook, then you’re on the site… even if you’re not on the site. Your information can be compromised if you’ve never had an account.
There’s not too much that you can do about the Facebook debacle, short of making sure that no one who has you as a contact uploads their data to the site. Though, there are a few things that you can do in general to reduce your footprint online.
As mentioned in my Internet security post, set your browser to hide you online. Most major browsers now have a “Do Not Track” option in them that will tell sites that you want to opt out of being watched. Most “good” sites will honour this and not track you. However, a few will still do so.
To combat this, we need to take the browser work a bit further. Having the browser automatically use incognito mode (Chromium/Chrome) will greatly reduce the amount of tracking data that the browser can pass on. However, incognito mode can cause problems with certain websites, so, you can do like I do and have the browser clear everything every time you close it. Firefox has this option, and while it’s not as robust as the incognito/stealth mode, it does make browsing significantly easier. Every time I close the browser and reopen it, it’s like I’ve just installed the browser; websites have nothing to track because as far as they can see, I’ve never been to any websites.
Now, if you don’t want to make any changes to your browser or you want another layer of security, you can change the search engine that you use. While the biggest ones such as Yahoo! and Bing also collect your data and share it, there are ones that are built specifically with privacy in mind. The main engine I use to do all of my searching is DuckDuckGo which keeps no logs on its users and sets up an encrypted connection (via SSL) between you and the search engine so nothing can be intercepted.*
Using the above techniques you can keep your search history private, or at the very least separate you from your searches.
Your Connection and Software
This is all well and good, but it doesn't protect you against someone snooping on your connection to the internet. Even though you're anonymous to the search engine, you're not so anonymous to someone who's watching you browse, such as your ISP or someone sniffing packets in a cafe. To secure that, we're going to need to hide your internet connection.
The easiest (cheapest) way to do this is to always try the https:// version of a website before the http:// (note the “s” for secure). This little change will create a secure connection between you and the website, making your traffic unintelligible to a malicious viewer. Not all sites support this, but some of the big ones do. The site you’re going to will still be visible, but the contents will not. Keep in mind that this is the “free” option and is very hit-or-miss.
Private Internet Access’s “Why use a VPN?” video.
Another option, and the route I would recommend, is to push all your data through an encrypted VPN (Virtual Private Network). There are a lot of them out there, depending on how much privacy you want and what price you're willing to pay for it. Some offer a full range of services, including news access and other benefits, like VyprVPN (will run you about $20/month); others offer simple unlogged access, like PrivateInternetAccess** (about $4/month). In both cases, the system creates an SSL (Secure Sockets Layer) VPN between you and their servers and then pipes you out to the internet with an anonymous IP.
Someone spying on you would only see a mass of garbled data being sent to some server somewhere where it disappears. Any website or person on the internet would see your data coming from a block of IPs owned by a VPN company. There’s virtually no way to connect the two (no pun intended).
If you're only concerned about eavesdropping when you're out and about, you can also use something like LogMeIn Hamachi to create an SSL VPN between a mobile device or another computer and a home machine. Keep in mind that with this setup, anyone watching your home machine will be able to see the data unencrypted. The secure connection is only between your remote device and the home computer.
Lastly, the software you use on the internet that isn't your browser, such as Skype or Yahoo! Messenger, is also targetable. There's only so much you can do to secure these, but a few things help. First of all, check your privacy settings and make sure you have everything locked down. Most of these services have a small but useful section in the options called "Privacy". Also, make sure your chat history isn't being saved; you can turn this off in every messenger. While that doesn't guarantee the data isn't being stored elsewhere, it does reduce the lifetime of the data and the chance that it will be recovered.
Am I Private Yet?
So the question remains: what effect will all this have? The truth is that we don't know entirely. Depending on who's targeting you and why, the measures listed above, if implemented properly, can range from a significant annoyance to a complete blackout. However, if you implement no privacy measures, you can rest assured that some, if not all, of your data is being collected and catalogued.
Not all of these may be for you, but a smattering of them in some form or another will help, especially the VPN services. I recommend you at the very least lock down your browser as mentioned above and in my previous internet security post. Even if you think you have nothing to hide, you may find out, in the worst way possible, that yes, you did.
* You can get the add-on/plugin for your browser of choice as well so it’s automatically in the upper right search box on your browser.
** If you’re just looking for privacy and nothing else, this is the way to go.
As I had mentioned on a placeholder post (since deleted) I have been in Vietnam for the past few weeks on an academic trip. This blog is generally geared towards technology, so I won’t be focusing on my trip per se, but on the technology I encountered there. There are a few things of interesting note to me and perhaps others that are part of every day life in Vietnam. I decided to combine these all into this one post.
Keep in mind that this is from an American's point of view, so some of this stuff may be, and is, used all over the world, but this was my first encounter with it en masse. The air conditioning systems mentioned later are a good example of the kind of minutiae I find interesting but that is probably old hat for people who've always used this stuff.
The first thing that struck me when I arrived was the cabling over the streets. While Vietnam is generally well "wired", in the sense that basic broadband was available in the cities I visited, the majority of it seems to be above ground. Cabling that would normally be hidden beneath the streets was up on posts, creating some very haphazard-looking displays resembling spider webs.
I actually saw some installers putting in some new wiring, but I was unable to catch any video of it. It mostly involved threading the wiring around the post and to its destination. It wasn’t clear to me how they were differentiating different cables from each other, or how they were avoiding cross-talk and interference, or if they were even concerned about that.
Speaking of being wired, the city of Da Nang was in the process of implementing a city-wide WiFi service. Even though it wasn’t officially available (it should be by the time this post hits) I was able to use it almost everywhere in the city with varying levels of success. It was about what you’d expect from a public wireless service. Useful, but not as robust as a privately-owned system.
3G service was fairly ubiquitous, and the VNMobile Blackberry I had been given had signal just about everywhere I went. I did not have the ability to test data transfer speeds, but 3-4 bars were present in most locations, and cities were generally solid throughout. Mobile devices themselves were everywhere, just as in any city anywhere in the world, though I saw far fewer tablets than stateside. I'm not sure of the reason for this, but I imagine transportation might be part of it. Most Vietnamese ride motorbikes, so maybe finding a place for a device of that size is difficult. I can only speculate.
Moving on to more minutiae, the traffic light systems are quite similar to what you'll find in just about every country, with the addition of a timer. Especially in the larger cities, lights had timers that would tell you how many seconds until the light would change. It was my understanding that this was to prevent people from preempting the lights and causing accidents, as well as letting motorists check their mobile devices or do other things at a stop light without holding up traffic when it suddenly went green.
Also, while this might be odd to point out, the air conditioners, both in private residences I visited and in most hotels, were single-room setups. They were mostly operated by remote and, as I found out later, are called "ductless" air conditioners. Here in the United States, A/C units are usually large affairs (especially in the case of central heating and air), even the small ones, and have to be planted on the outside of a residence. The units I encountered in Vietnam used less power, could be placed anywhere in a building and were hyper-efficient. However, they had the drawback of not offering quite the cooling power of the Western ones I'm accustomed to.
Lastly, along the same line as the air conditioners, the most common type of water heater was not a tank water heater as is common in the States. Almost every place I went used in-line tankless water heaters. These work by heating water as it’s used rather than heating and holding it until use. These can be set up to heat with electricity (the most common I saw), natural gas or even propane. The only problem I had with these was that they sometimes didn’t get hot enough or took a long time to get “warmed up”. Again, very efficient but not as robust as the tank ones I’m used to using in the US.
I did a lot more on this trip than look at water heaters and street lights, but I thought that these little tidbits were the best suited for this blog. I find the differences in the technology that people use on a daily basis the most interesting, as all “good” technology intertwines itself seamlessly into our lives.
So, let's get the scary stuff out of the way first. There is no system, software or hardware that is un-hackable. There is no 100% infallible way to keep your accounts from being compromised. And even if there were such a thing, there's no way to keep all of the sensitive data you have on the internet inside of it. Work, social media sites and even your own personal devices are all points of entry for a malicious user to wreak havoc on your life should they target you.
Therein lies the trick; make sure you’re not a target, or at the very least, not a tempting one. Make the wall so high that even though it is scalable, it’s just too much work or not worth the payoff/risk. There are a few simple things I do to reach these ends, and I’m going to enumerate them below in the hopes that it will help someone secure themselves a little better.
Point of Note: This is not an all-encompassing list of everything that can be done. It does not hit every single base or show you how to use every available tool to its maximum potential. I've focused specifically on the steps that will yield the greatest improvement in personal security for time invested. There are much more thorough listings and how-tos out there on how to really lock your stuff down. These are some simple, quick things you can do to make yourself a little safer in the digital realm.
Every good article on internet security starts here, and for good reason. People are very bad at choosing good passwords and good password habits. If it’s been said once, it’s been said a thousand times: Your accounts are only as secure as the passwords that guard them.
First of all, choosing a good password is a must as they are the keys to all of your data. They need to be LUC-keys: Long, Unique and Complex.
LONG – At the very minimum eight characters, and the best are even longer (some of my more important accounts are up to 16 characters). The longer the password, even a weak one, the harder it is to guess or crack.
UNIQUE – Each account should have its own password that is unique to that account. If you use the same password on multiple important accounts, then a malicious hacker would only need to break one and they’d have access to all of your accounts. Suddenly, a minor email breach could open up your bank or Facebook accounts if the same password is shared.
COMPLEX – Choose complex passwords that contain numbers, letters and symbols if allowed. Try to avoid common words, names or dates, especially if they could be easily guessed. Your own, your family’s or a significant other’s birthdays and names are all information that is easily obtainable through public records.
Passwords that follow these rules are very difficult to crack, and even if one is cracked, you’ve only lost control of a single account. You’re creating the most work for the malicious party with the least payoff.
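To put the “long and complex” advice in rough numbers, here’s a quick back-of-the-envelope sketch of the brute-force search space (the function name and alphabet sizes are my own, chosen purely for illustration):

```python
import math

def search_space_bits(length: int, alphabet_size: int) -> float:
    """Bits of entropy in a randomly chosen password."""
    return length * math.log2(alphabet_size)

# 8 lowercase letters vs. 16 characters drawn from letters, digits and symbols
print(round(search_space_bits(8, 26)))   # ~38 bits
print(round(search_space_bits(16, 94)))  # ~105 bits
```

Every extra bit doubles the work for a brute-force attacker, so going from 8 simple characters to 16 complex ones is an astronomical jump, not an incremental one.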
A password is always better when it’s coupled with good account security. You can’t control everything, though. If a site gets hacked and your password gets spilled, then there’s not a lot you as a lowly account holder can do about it. However, most sites offer a plethora of privacy and security options that can lock down your account even further.
Every site has privacy controls of some kind, even if it’s a super-simple site like a bulletin board or forum. These can be used to mask information about your account that could be used maliciously, or in some cases even hide yourself entirely from people that you don’t know. Sometimes this can be daunting, like in the case of Facebook’s privacy system, but the payoff in security can be huge. It’s worth your time to check them out at the very least and see if there’s anything useful in there.
Not all sites offer this, but the big ones like Google, Twitter and hopefully your bank do. Two-factor (or two-step) authentication is exactly what it sounds like: upon logging in (sometimes only under certain conditions), the account holder has to verify that the login is legitimate by completing a second authentication step. Usually this means entering a code that is texted to a phone number previously set on the account. Even if your account is already compromised, you’ll be notified that someone accessed it.
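Texted codes are the most familiar flavor, but the rotating codes that authenticator apps generate follow a published scheme: time-based one-time passwords (TOTP, RFC 6238). A minimal sketch of how both sides compute the same code from a shared secret and the clock, using the RFC’s own test secret:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, timestamp=None, period=30, digits=6):
    """Time-based one-time password (RFC 6238, SHA-1 flavor)."""
    if timestamp is None:
        timestamp = time.time()
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", int(timestamp) // period)
    mac = hmac.new(key, counter, "sha1").digest()
    offset = mac[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret ("12345678901234567890" in base32)
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", timestamp=59, digits=8))  # 94287082
```

Because the server and your device derive the code independently, a stolen password alone isn’t enough to get in.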
A lot of information can be gathered from the electronics that you use to access your accounts. Mobile devices can be stolen, remotely accessed or even “borrowed” and used for malicious purposes. Your home computer can be infected or stolen as well. It’s important to know the features of your operating system and what you can do to lock it down further.
First, let your browser store your passwords. I know it sounds counter-intuitive, but modern browsers have sophisticated mechanisms for encrypting (scrambling) stored passwords and keeping them safe. They’re also generally resistant to keyloggers (programs that record keystrokes), meaning that even if you have malicious code on your system, it’s much less likely to pick up your passwords if they’re being filled in for you by, say, Chrome.
There are also utilities that build LUC-key passwords for web sites for you (see The Passwords, above) and remember them as well. You can have 26-character passwords with a crazy amount of random letters, numbers and symbols in them that you don’t have to remember yourself. Just look for extensions/add-ons for your browser that talk about password hashing*.
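The extensions differ in their details, but the core idea is deriving each site’s password from one master secret with a one-way function. A rough sketch of the concept (the key-derivation choices, iteration count and formatting here are mine, not any particular extension’s):

```python
import base64
import hashlib

def site_password(master: str, site: str, length: int = 16) -> str:
    """Derive a repeatable, site-unique password from a single master secret."""
    digest = hashlib.pbkdf2_hmac(
        "sha256", master.encode(), site.encode(), 100_000
    )
    # Base64 output mixes upper/lowercase letters, digits and a few symbols
    return base64.b64encode(digest).decode()[:length]

# The same inputs always regenerate the same password;
# a different site name yields a completely different one.
print(site_password("correct horse battery staple", "example.com"))
```

Because the derivation is one-way, a password leaked from one site can’t be worked backwards into your master secret or any of your other passwords.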
Finally, don’t let just anyone use your devices, or at the very least your profiles. Not that your grandmother is a black-hat hacker, but if you’re storing your passwords in your browser, then anyone on your profile has easy access to them. If someone else compromises your security, you’re the one who’s going to pay for it. Make a guest account if possible and always have a password on your own. On my mobile devices I actually have two browsers: one that I use myself that stores all of my passwords, and another that I load up for others to use.
An Endless Battle
There are a hundred more things you can do to make yourself even more secure, from building TrueCrypt volumes to encrypting your whole drive or home directory. However, the ones listed above are all very easy to get going and offer an incredible boost in security. The passwords section in particular is widely considered a must-do list that everyone should be following. Keep these things in mind, do them, and you’ll be on your way to making yourself a very undesirable target.
*”Password Hasher” for Firefox is one example; see the figure to the right.
I have terrible luck with MicroSD memory cards. Seriously, I’ve destroyed two of them in the past six months. I’d like to think it’s because of a manufacturing defect, but I’m pretty sure it’s just my inherent clumsiness.
See, my tablet, a Galaxy Note 10.1, uses this type of storage and I spend a lot of time moving things to and from it. It’s usually large files or huge blocks of small files, so it takes quite a lot of time unless I put the card itself into an adapter and plug it into my computer. Even the USB linking ability through the port on the tablet is painfully slow and sometimes just plain doesn’t work.
Emailing the files was sometimes the solution, but was impractical for larger files. Sometimes I could transfer through a USB stick, but that too was cumbersome. A few programs existed that allowed transfer between a computer and the device over Wi-Fi, but most of them were lacking in some key respect or didn’t function as I needed. Then I found AirDroid.
AirDroid is not exactly new to the scene, and in fact when I actually broke down and started searching for a solution to my problem, it was the first one to pop up. So I grabbed the “light” version and was throwing things to and from my tablet within minutes. All you need to do is grant it superuser permissions (so it can read/write/get updates) and sign up for the service (if you want to use the optional web version).
The app has two ways of connecting to your tablet, both of which involve configuring your tablet to act as a kind of file server. The first is to connect directly to your tablet over your current Wi-Fi network by pointing your browser to a specific IP and port (usually [Local IP Here]:8888). Then, through the gorgeous GUI, you can add/remove files, contacts and ringtones (if it’s an Android phone, of course), as well as just about anything else that resides on your device.
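If the page won’t load, it’s usually just a connectivity problem rather than anything wrong with the app. A quick way to check whether the device’s web interface is actually listening (the IP address here is a placeholder; use whatever address the app shows you):

```python
import socket

def device_reachable(ip: str, port: int = 8888, timeout: float = 2.0) -> bool:
    """Return True if something is accepting connections at ip:port."""
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            return True
    except OSError:
        return False

print(device_reachable("192.168.1.50"))  # replace with your tablet's local IP
```

If this prints False, the app probably isn’t running, or the tablet and computer are on different networks.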
The second way is similar to the first, except that you go through the AirDroid website (web.airdroid.com) to transfer files. This is useful if your tablet/phone is at home and you need to get something off of it. Assuming that your AirDroid app is running and connected, you can grab your files from literally anywhere in the world. There is a 1GB transfer limit on this function if you’re using the free version, though. So keep that in mind if you’re trying to pull a movie or something from your device.
Both methods look identical, in that the web interface is the same for each. The GUI has a multitasking feature, letting you add/remove files while checking your notifications and anything else you have the bandwidth for, as well as stats on your device like its battery life and storage capacity.
AirDroid did not crash or hang the entire time I used it, no matter how much stress I put it under. I was transferring several gigabytes of files to and from it while poking around in my contacts and looking at photos. I also run my tablet through an SSL VPN and didn’t have any trouble with that setup either.
On the whole this is a brilliant piece of software and an absolute must-have for any Android user who moves a lot of data around their mobile devices, which is probably everyone. AirDroid2 should be coming soon to my device and I am definitely looking forward to that.
Rating: 5/5 – Absolutely Perfect. You need this app.
I can count on one hand the number of online courses I’ve had that I enjoyed and really felt like I was part of the class. While a few have been decent, my experience with them on the whole has been less than admirable. In the right hands, and with the right professors, online courses can be good learning tools, but they invariably felt distant (not just physically) and left me wondering whether I really understood the material.
When I was given a choice of colleges, I made sure to focus on brick-and-mortar schools where I could get the bulk, if not all, of my education by showing up to an actual classroom with actual classes. But with any education nowadays, you’re going to run into classes that are unavoidably online-only. For the most part, that’s fine. Not every class needs the one-on-one focus of an actual classroom, and I can understand that.
I stated all the above as sort of a disclaimer about my position on the subject of distance learning. Put quite simply, I think that all-distance-learning degrees (ones that are online from the 100 level to completion) are junk.* Now there’s a new trend of MOOCs (Massive Open Online Courses), which are classes that can be taken, for free or very little cost, by anyone, even those who are not necessarily students of an educational system.
And to me this seems fine and interesting. There was a Stanford MOOC a few months back on artificial intelligence that caught my eye. I might have taken it myself if I weren’t knee-deep in my own college classes at the time. However, these courses are now being considered for college credit. Color me skeptical.
Color a whole bunch of professors skeptical as well. Whole departments at colleges and universities have flat-out told their administrations that they will not participate. In the article below, they argue that the material cannot be presented properly and the class size is too big to be effective.
The AI class from Stanford that I talked about before drew 160,000 students. Did they all get a college credit’s worth of education? Call me a cynic, but I seriously doubt it. America should have a free or low-cost education system, as our current one is too costly to be useful or fair, but this is not the way to go about it. I fear we may devalue our higher education system and drive smaller schools into the dirt. MOOCs seem like a great idea in moderation, but we can’t see them as a substitute for “real” classes.
San Jose State University, one of the biggest academic supporters of the growing MOOC (massive open online course) movement, apparently has some vocal dissenters in its ranks.
In the past year, the university has welcomed MOOC providers like edX and Udacity with open arms — in addition to launching a first-of-its-kind program with Udacity to award college credit for courses taken on its platform. The school has a growing partnership with edX and plans to create a dedicated resource center for California State University faculty statewide who are interested in online content.
But discord seems to be brewing among some faculty. This week, professors in the Philosophy department said they refuse to teach an edX course on “justice” developed by a Harvard University professor, arguing that MOOCs come at “great peril” to their university.
*Again, I want to point out that I recently took a few online classes. In the hands of the right professor, with the right class size and subject, individual classes can be done properly. That is the exception rather than the rule.
I tweeted the other day about Google’s new Chrome Office Viewer Extension (COVE?) that was in beta. It would allow users to see Office documents (as in the Microsoft kind) right in their web browser window. I excitedly talked about how it may move me to Chrome, because I do open a lot of web-hosted word processing documents. It sounded exciting!
Moving from one browser to another would be a herculean task for me, but I was willing to do it for such a neat feature, if it worked as advertised. While importing bookmarks is no big deal, moving my encrypted passwords (some to sites I don’t even remember using) and tying a Google account to the browser were not things I particularly wanted to do. But I was willing to give it a try.
I downloaded Chrome on my laptop and desktop and set about getting the extension. However, I have been unable to get the extension to install. Google has disabled it for the two operating systems I use the most: Windows 8 and Ubuntu Linux. I even tried launching Google Chrome in Windows 8 Mode, but to no avail. While this is beta, I can’t be the only one who uses these two OSes, or just one of them exclusively.
This left me rather disappointed and solidified me further into the Firefox camp, where all my stuff resides anyway. Maybe I’ll keep Chrome around for a bit longer just to see what’s changed since I last used it, or wait until the Office Viewer gets a proper release, but Firefox is still sitting pretty in my book. I’ll stay there and possibly try again when this comes out of beta.