Rebel.com review

Rebel.com is a Canadian-based, ICANN-accredited domain registrar (registered in Barbados). Besides domain name registrations they also offer e-mail and webhosting, but I have only used them as a domain registrar. From time to time they run some pretty good promotions on both registrations and transfers, which is what initially caught my attention. After having used Rebel.com for about 18 months, I am now in the process of moving my few remaining domains away. All in all, I do not recommend Rebel.com. More details about my experiences are outlined below.

Review

The control panel is “messy” and takes a while to get used to and figure out. The WHOIS contact information manager in particular is painful to work with, and the entire control panel calls for a major makeover. Instead of being smooth and easy to use, it looks like it was mainly designed as an advertisement for add-on products such as WHOIS privacy and webhosting.

By default, domains are set to auto renew. Personally, I prefer to turn auto renew off, so I can decide manually whether I want to keep a domain with a given registrar, transfer it to another registrar, or just let the domain expire. To turn auto renew off you must contact support. This is annoying and something you can easily forget to do until it is too late. Clearly, this option could easily be integrated into the control panel, as the majority of other registrars have done. Rebel.com will automatically charge the renewal fee 45 days prior to a domain’s expiry date unless auto renew is turned off. It looks like an attempt to get recurring customers through obscurity.

If you forget to renew a domain, they charge up to $100 in “processing fees” to reactivate it. This is extremely high in comparison to other major registrars, which charge considerably less.

I have transferred domains both in and out of Rebel.com. Transferring in was smooth and easy, but transferring out was more complicated. Firstly, getting the authorization code (EPP code) and unlocking the domain took a while to figure out and is not documented in their help section. Secondly, they do not provide an option to explicitly accept the transfer on their end. Not even their support team is capable of accepting a transfer-out request. As a result, you have to wait a minimum of five days before the transfer completes automatically under ICANN's rules. My experience from other major registrars is that a transfer can be completed within a couple of hours.

My experience with support is one of the more positive aspects. I have contacted support a couple of times through e-mail, and they responded fairly fast, even during weekends. However, at times it looked like I was getting canned responses that did not address my question. Unfortunately, the reason I had to contact support in the first place was usually the troublesome control panel…

Pros

  • Fast response from support.
  • Free DNS with full control.
  • Attractive promotions from time to time.
  • Reliable registrar.

Cons

  • Registration and renewal costs are higher than at other reputable registrars.
  • “Messy” control panel missing several features.
  • No free WHOIS privacy (not even for the first year.)
  • Auto renew is turned on by default, and a support ticket is required to turn it off.
  • Difficult and slow transfer-out process.

Verdict

In conclusion, my experience with Rebel.com has been mixed, and I cannot recommend them. They offer a standard domain registration service and are trustworthy, but in comparison to other well-established registrars such as NameCheap and NameSilo, their domain service is more expensive, more troublesome, and offers fewer features. Especially the hefty processing fees for reactivating expired domains were the final nail in the coffin for my decision to move all my domains away. I could be tempted to use them as a secondary registrar when they are running a good promotion, but only to transfer the domains out again as fast as possible.

Active Involvement of Software Developers in Usability Engineering: Two Small-Scale Case Studies

Together with co-author Jan Stage, I got a short paper accepted at INTERACT 2017.

Abstract

The essence of usability evaluations is to produce feedback that supports the downstream utility so the interaction design can be improved and problems can be fixed. In practice, software development organizations experience several obstacles for conducting usability engineering. One suggested approach is to train and involve developers in all phases of usability activities from evaluations, to problem reporting, and making redesign proposals. Only limited work has previously investigated the impact of actively involving developers in usability engineering. In this paper, we present two small-scale case studies in which we investigate the developers’ experience of conducting usability evaluations and participating in a redesign workshop. In both case studies developers actively engaged in both activities. Per the developers, this approach supported problem understanding, severity ratings, and problem fixing. At the organizational level, we found that the attitude towards and understanding of the role of usability engineering improved.

Nis Bornoe and Jan Stage. 2017. Active Involvement of Software Developers in Usability Engineering: Two Small-Scale Case Studies. Human-Computer Interaction – INTERACT 2017. Lecture Notes in Computer Science. Springer.

A few words about selling UX

A while ago I attended a seminar about the differences and similarities between usability and UX and, not least, the problems of understanding, separating, and combining the two into something specific. During this seminar, a problem discussed among the practitioners was how to sell UX.

There are a number of challenges when presenting and selling UX to clients. UX is a “fuzzy” term, and it is not easy to explain what is delivered, what UX looks or feels like, and how the value of UX can be quantified. For example, a chief product officer of a small software development organization once told me: “…how does one get usability included into business cases so that they are credible higher up in the system?”

At the seminar, the following advice was given for talking to clients:

  • Use facts.
  • Include UX people during sales meetings.
  • Include UX explicitly in business cases.
  • Compare to UX strategies taken by competitors.
  • Show something “beautiful” early in the process.
  • Get allies in the organization.

BunnyCDN review

BunnyCDN (affiliate link) is a low-cost unmanaged content delivery network (CDN) service. Setting up the service and implementing the CDN into your website is an easy procedure, especially if you already understand the basics of a CDN. BunnyCDN is one of the cheapest available CDN providers, charging $0.01/GB for US and European traffic and $0.03/GB for Asian and Oceanian traffic. In comparison, a similar package from KeyCDN is priced at $0.04/GB, and MaxCDN charges about $0.08/GB (and a minimum of $9/month.)

I recently signed up for a 14-day free trial to evaluate BunnyCDN, and this review reflects my first impression after using their CDN for a couple of weeks.

What is a CDN and why use one?

In essence, a CDN is about efficiently distributing content. A CDN consists of servers distributed globally at essential locations such as major internet traffic points and cities. Each server in the CDN contains a cache of certain files and serves the content to visitors through the fastest possible route. A CDN is mainly used to host static content such as images, CSS and JS files, videos for download and streaming, and other files for download. Three major reasons for using a CDN are to increase download speed for visitors, to have redundancy in case a server is down or overloaded, and to lower the load on the server hosting your website.

A relatively simple CDN such as BunnyCDN does not host dynamic content such as databases and server-side scripts. More advanced (and considerably more expensive) CDN providers can also host dynamic content, but this increases the complexity of the setup considerably and will be overkill for most websites. It is also important to note that a CDN is not an alternative backup solution.

Generally, content is shared with a CDN through either a pull or a push zone. A pull zone connects to your server and downloads the static content to the CDN when the content is requested. This is the easiest option to set up and maintain. In a push zone setup, you upload the files directly to the CDN and do not need a local copy. This option requires more time to set up and maintain in comparison to a pull zone, and it is also more expensive since you have to pay a storage fee for having the CDN host your files.
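
To make the pull-zone idea concrete, here is a minimal sketch (in Python) of what “implementing the CDN into your website” typically amounts to: rewriting the URLs of static assets so they point at the CDN hostname instead of your own server. The hostname cdn.example.com and the path prefixes are placeholders of my own, not anything a specific provider prescribes.

    # Minimal sketch: route static assets through a pull-zone CDN by
    # rewriting their URLs. "cdn.example.com" is a placeholder hostname.
    CDN_HOST = "cdn.example.com"
    STATIC_PREFIXES = ("/images/", "/css/", "/js/")

    def cdn_url(path: str) -> str:
        """Rewrite a local static path to its CDN equivalent."""
        if path.startswith(STATIC_PREFIXES):
            return f"https://{CDN_HOST}{path}"
        return path  # dynamic content stays on the origin server

    print(cdn_url("/images/logo.png"))  # https://cdn.example.com/images/logo.png
    print(cdn_url("/blog/post-1"))      # served by the origin as before

The first time a visitor requests a rewritten URL, the pull zone fetches the file from the origin and caches it at the edge; subsequent requests are served directly from the cache.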

Is a CDN a must for all websites? Absolutely not. This blog uses a CDN to serve most static content, but it does not really require one. The number of visitors could easily be handled by the shared server hosting the blog. The site would be a bit slower without the CDN, but this is hardly noticeable for the majority of visitors. The content of the specific website and the number of daily visitors are two crucial factors when deciding whether to offload traffic to a CDN. Sites without many images, downloadable files (e.g., PDF, PPT, DOC), videos, etc. will not benefit much from a CDN, while frequently updated blogs with a steady stream of visitors, e-commerce sites, and archives of PDF files are good examples of sites that will benefit.

Features

BunnyCDN provides many features, so I will just point out a few essential ones:

  • Multiple pull and push (currently in beta) zones.
  • Custom CNAME hostnames (setting up a custom CNAME hostname is recommended because 1) you have more control, for example, you can assign your own SSL certificate to the hostname, 2) in case you decide to shift to a different CDN provider the transition will be easier, and 3) it looks more professional and trustworthy.)
  • SSL support (you can use your own or get a free one from BunnyCDN.)
  • Blocking of countries, individual IPs, and entire IP ranges (blocking happens at the DNS level.)
  • Redirection of visitors from specific countries and regions (to save bandwidth costs, it's possible to redirect visitors to a cheaper data center. For example, you can redirect all traffic from Asia to US or EU servers to use cheaper bandwidth.)
  • All locations support both IPv4 and IPv6.
  • Option to purge the entire cache at once or just individual files.
  • No restrictions on file extensions or sizes.
  • Hotlinking protection and URL authentication.
  • API access (see the sketch after this list.)
  • Basic statistics including real-time log monitoring.
  • Monthly traffic limit (avoid unexpected bills.)
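
As an illustration of the API access mentioned above, here is a hedged sketch (in Python, using the requests library) of purging a single cached file. I have not verified the endpoint, query parameter, or “AccessKey” header against the official API documentation, so treat all three as assumptions and check the docs before relying on this.

    import requests  # third-party: pip install requests

    API_KEY = "your-api-key-here"  # placeholder, found in the dashboard

    def purge_file(file_url: str) -> bool:
        """Ask the CDN to drop its cached copy of file_url.

        The endpoint and header below are assumptions based on typical
        CDN APIs; consult the official docs for the real interface.
        """
        response = requests.post(
            "https://bunnycdn.com/api/purge",  # assumed endpoint
            params={"url": file_url},
            headers={"AccessKey": API_KEY},
        )
        return response.ok

    purge_file("https://cdn.example.com/images/logo.png")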

The full list of features can be found here: https://bunnycdn.com/features/

Network

As of this writing, BunnyCDN has 13 points of presence (PoPs), i.e., edge servers caching content in 13 different data centers around the globe, spread across Europe, the United States, and Asia & Oceania. A South American presence is planned but not yet in operation, and the location of the South American PoP has not been disclosed. Below is a list of the current locations:

United States:

  • New York City, NY
  • Los Angeles, CA
  • Kansas City, MO
  • Dallas, TX
  • Atlanta, GA
  • Seattle, WA

Europe:

  • Paris, France
  • London, United Kingdom
  • Falkenstein, Germany
  • Bucharest, Romania

Asia & Oceania:

  • Tokyo, Japan
  • Singapore
  • Sydney, Australia

User interface

The BunnyCDN user interface is very simple and fast to learn. It took me a couple of minutes to set up a new pull zone and under an hour to become more or less “fluent” in the entire interface and the functionality provided. I believe beginners with basic knowledge of website development will be able to set up a pull or push zone and understand the basics within an hour or two. Some of the more advanced options such as SSL and advanced security settings will take a bit longer. The dashboard loads fast and smoothly (since BunnyCDN is a CDN provider, anything else would be strange). Everything is kept in a simple yet stylish manner and follows the standards of typical dashboard design.

Basic statistics about bandwidth usage, file requests, and cache hit rate are provided along with real-time log monitoring. I am missing statistics about individual files, especially something like a top-25 list of the most downloaded and most bandwidth-consuming files. That would be very helpful for getting an idea of what takes up bandwidth, whether you are just curious or want to optimize files. To get a more detailed understanding of the traffic, you will need to measure through other means.

Main page of the dashboard showing an overview of the account.
For each pull zone it’s possible to edit different parameters such as cache, security, and traffic settings.
Overview and settings of a push zone.

Price

BunnyCDN charges on a monthly basis for the traffic actually used (with per-byte billing accuracy). While there are no monthly minimum payments, there is a yearly minimum of $5. BunnyCDN accepts major credit cards and Bitcoin, but unfortunately not PayPal. The pricing structure is simple and transparent as they provide one plan. Due to differences in bandwidth prices around the world, they charge according to the following scheme (a worked example follows the list):

  • US and Europe: $0.010/GB
  • Asia & Oceania: $0.030/GB
  • South America: unknown at the moment.
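
To see what this pricing means for a concrete bill, here is a small worked example; the traffic figures are invented for illustration, and the rates are the ones listed above.

    # Worked example of the per-GB pricing listed above.
    RATES = {"us_eu": 0.010, "asia_oceania": 0.030}  # $ per GB

    traffic_gb = {"us_eu": 200, "asia_oceania": 50}  # invented monthly traffic

    total = sum(gb * RATES[region] for region, gb in traffic_gb.items())
    print(f"Estimated monthly cost: ${total:.2f}")  # $2.00 + $1.50 = $3.50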

For storage used by push zones, they charge $0.02/GB per month (currently, it seems that they do not charge for storage as this feature is in beta.)

Pricewise, BunnyCDN is at the absolute bottom when compared with similar CDN providers; as mentioned, a similar package from KeyCDN is priced at $0.04/GB, and MaxCDN charges about $0.08/GB.

It is possible to save bandwidth costs by redirecting visitors to a cheaper data center. For example, you can choose to serve traffic only from US and European locations no matter where the visitor is located. This redirection somewhat conflicts with the philosophy of using a CDN, but it can help cut costs if needed and will in many cases still be faster than not using a CDN at all.

Based on a small, nonsystematic evaluation, the bandwidth usage statistics look pretty precise and match my own calculations of expected bandwidth usage well.

Speed

While I have not conducted a systematic evaluation of the download speed from BunnyCDN, I have tested it by downloading large files from different locations in the US and Europe over several days. Overall, the speed has been very satisfying. I have typically been able to download at 30 MB/sec, equivalent to about 240 Mbit/sec, and have reached 90 MB/sec, equivalent to about 720 Mbit/sec.

If you want to make your own simple speed test of BunnyCDN, you can download the official 100MB test file: http://test.b-cdn.net/100mb.bin
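
For a quick-and-dirty measurement of your own, something like the following sketch (Python standard library only) downloads the test file and reports the average throughput; only the URL comes from BunnyCDN, the rest is generic code.

    import time
    import urllib.request

    # Rough speed test: download the official 100 MB test file and
    # report the average throughput.
    URL = "http://test.b-cdn.net/100mb.bin"

    start = time.time()
    with urllib.request.urlopen(URL) as response:
        size = len(response.read())  # whole file is read into memory
    elapsed = time.time() - start

    mb_per_sec = size / (1024 * 1024) / elapsed
    print(f"{mb_per_sec:.1f} MB/sec (~{mb_per_sec * 8:.0f} Mbit/sec)")

Run it a few times, as the first request from a given location may be a cache miss served from the origin.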

Uploading files to a push zone is also lightning fast. I was able to upload at 6 MB/sec, equivalent to about 48 Mbit/sec, which is pretty good for the internet connection I was using.

As mentioned, BunnyCDN offers a 14-day free trial, so you will be able to do a more comprehensive evaluation to see if BunnyCDN meets your expectations and works for your specific needs.

Caching

An essential aspect of a CDN is that your content is cached at the different locations, so files actually are served through the CDN. My cache hit rate (i.e., the share of file requests served from the CDN's caches; for example, if 600 out of 1,000 requests are answered from the edge servers, the hit rate is 60%) is currently about 60%, which is on the low side. Since I only recently started using BunnyCDN, I suspect that the low hit rate simply means that the caching process is still in progress, and I expect it to increase to about 80-90% during the coming days. Content is only cached at a given location if someone actually requests content from that location.

Support

BunnyCDN only provides support through e-mail. I have not needed support so far, but according to other customers, support requests are handled within a day. The tutorials and documentation provided by BunnyCDN are relatively limited at this point. Users should therefore be able and willing to do most things on their own, but through learning by doing, most users should be able to get things to work.

Conclusion

In conclusion, I find BunnyCDN to be an interesting low-cost CDN option for private users and small businesses with basic feature requirements and limited budgets. On the positive side, I would point out the simplicity of the interface and API, the speed and the network, and the low cost and transparency of the pricing structure. It's easy to get both a pull and a push zone up and running, and the caching of files seems to happen quickly, within a few minutes. On the downside, I find the statistics limited, and I would like statistics about individual files. They only provide e-mail support and no uptime guarantee, there is no DDoS protection in place, and the documentation is limited at this point. The service is still under development, so expect minor “blips” such as links not working, sections partially under development, and missing or outdated information around the website.

Ready to try out BunnyCDN? Then click here (affiliate link) and get your 14-day free trial (no payment information required.)

Joining the ACM IUI 2017 program committee

I have joined the program committee of ACM IUI 2017 (the 22nd International Conference on Intelligent User Interfaces). The program committee of the IUI conference is organized slightly differently from other ACM conferences. Usually, the members of a program committee assign each submission a number of external reviewers and write a meta-review. At IUI, the program committee is divided into two subgroups: a senior program committee and a (junior) program committee. Each submission is assigned three reviewers: two (junior) PC members and one senior PC member. What distinguishes the (junior) program committee from external reviewers is that we are more closely linked to the process and participate in activities such as paper bidding and discussion. As a Ph.D. student, this is an excellent opportunity to get a gentle introduction to the work and assignments handled by a conference program committee.

The 22nd International Conference on Intelligent User Interfaces (IUI ’17) will take place March 13-16, 2017, in Limassol, Cyprus, and the scope of the conference is described as:

At ACM IUI, we focus on the interaction between machine intelligence and human intelligence. While other conferences focus on one side or the other, we address the complex interaction between the two. We welcome research that explores how to make the interaction between computers and people smarter, which may leverage solutions from data mining, knowledge representation, novel interaction paradigms, and emerging technologies. We strongly encourage submissions that discuss research from both HCI and AI simultaneously, but also welcome works that focus more on one side or the other.

Facilitating Redesign with Design Cards: Experiences with Novice Designers

Together with co-authors Anders Bruun and Jan Stage, I got a paper accepted at OzCHI 2016, the Annual Conference of the Australian Computer-Human Interaction Special Interest Group. OzCHI 2016 takes place in Launceston, Tasmania, from November 29 to December 2, 2016.

Abstract

While effort has been put into developing and evaluating usability evaluation methods, less attention has been paid to shifting usability feedback into improved designs. We report from a study with 44 novice designers creating redesign suggestions. Some were provided with domain-specific design cards to facilitate the redesign process. Design cards are physical cards used to structure a collaborative process and to provide design cues such as keywords and questions. Afterward, three developers assessed the quality of the suggestions. We found that the cards diversified the range of system aspects that novices considered, supported ideation, and kept the discussion going. However, the cards did not compensate for the limited design experience, and the participants had challenges understanding the value of the cards and implementing them in the process. Having developers assess the subjective quality of the suggestions turned out to be challenging due to low inter-rater reliability.

Nis Bornoe, Anders Bruun and Jan Stage. 2016. Facilitating Redesign with Design Cards: Experiences with Novice Designers. In Proceedings of the 28th Annual Conference of the Australian Computer-Human Interaction Special Interest Group: Connected Futures (OzCHI ’16). ACM, New York, NY, USA.

Review submitted to OzCHI 2016

I submitted a review to the OzCHI 2016 conference, formally named “The Annual Meeting of the Australian Special Interest Group for Computer Human Interaction.” Here at our group, the Research Centre for Socio+Interactive Design at Aalborg University, we have for several years participated in and supported this event. This is the first time I am reviewing for OzCHI and the second time submitting a full paper. This year's theme is “Connected futures.”

OzCHI 2016 will be held in Launceston, Tasmania, from November 29 to December 2, 2016.

Work in progress: case study about the impact of usability work

Figure 1: Screenshot of the original version of the TonePrint Editor.

Currently, I’m spending part of my summer writing a small-scale case study about the impact of usability work. Back in 2014, I was part of a team arranging a redesign workshop [1] for a development team at the company TC Electronic, now known as Music Group Innovation. They wanted to evaluate and improve an application called the TonePrint Editor (see figure 1.) The essence of the workshop was to facilitate the development team in fixing a list of previously identified usability problems. I recently returned to Music Group Innovation to conduct a small-scale case study about the redesign process of the TonePrint Editor. I wanted to do a follow-up to explore the process around making design changes to the application and what had happened with the identified usability problems.

Project introduction

The use of usability evaluation methods is a widely accepted approach during iterative software development. One form of usability evaluation is the formative approach, often conducted with a think-aloud method. Formative usability evaluations are used to get feedback about users’ behavior when using an application and to get the users’ qualitative feedback about the concepts and designs used. The feedback reveals insights about how users perceive, understand, and interact with a system, insights that can be used to improve and develop an application. A lot of research has focused both on developing different usability evaluation methods and on evaluating the effectiveness of existing methods [6]. Less attention has been paid to the relationship between the output from evaluations and improvements made to a given application [7]; such research is complicated, time-consuming, and resource-demanding [5]. Dennis Wixon boils this down to the point that “…problems should be fixed and not just found” [8]. Since the point of usability evaluations is to end up with an improved interaction experience, it is relevant to investigate what happens with identified usability problems, how developers use the feedback from evaluations, and what they perceive as useful about the insights.

Figure 2: Screenshot of the new version of the TonePrint App.

In the redesign workshop held two years ago [1], the main focus was to engage the developers in an innovative redesign suggestion process through active involvement. As part of the workshop, we included a short lecture about basic principles of interaction design. The intention of the lecture was to have the developers think about UI design in broader terms and get inspiration for coming up with redesign proposals. Before the workshop, the developers had conducted a formative usability evaluation and compiled a list of 19 usability problems. The outcome of the workshop was ideas for changing the flow of the main screen and minor changes to the interface. After participating in the redesign workshop, the development team has continued the redesign process and made several changes to the application.

During my revisit to Music Group Innovation, I conducted a semi-structured interview with two members of the development team: a product manager and a developer. We spent a couple of hours talking about the changes made to the application, the redesign process, and the impact on the organization after engaging in user-centered design.

Preliminary insights

I’m still in the process of analyzing the interview in detail, but I will outline a couple of interesting insights here.

Prioritizing usability problems

When we talked about the list of the 19 identified usability problems, the first step after the evaluation was to prioritize the problems to decide which ones to fix. During the compilation of the list, a classic severity rating (minor, moderate, or severe) was given. Additionally, two other ratings were added. The interviewed programmer would give a complexity rating (1-8), the estimated technical complexity of fixing the problem. The interviewed project manager would give a business value rating (1-8), the estimated importance of the affected functionality to the application. Both are also related to the estimation of required resources. The three ratings were then used to decide which problems to prioritize. Through this prioritization process, the development team was able to understand and analyze the problems from more angles. It also served to make the fixing of usability problems more goal-oriented. Initially, they prioritized seven problems. In the end, the team made fixes for 14 problems.
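
To illustrate how three such ratings can support a prioritization, here is a minimal sketch. The scoring rule (severity and business value pull a problem up, complexity pulls it down) and the example problems are my own invention; the team combined the ratings through discussion, not through a formula.

    # Illustrative ranking of usability problems by the three ratings
    # described above. The problems and the scoring rule are invented.
    SEVERITY = {"minor": 1, "moderate": 2, "severe": 3}

    problems = [  # (description, severity, complexity 1-8, business value 1-8)
        ("Confusing main-screen flow", "severe", 6, 8),
        ("Unclear button label", "minor", 1, 3),
        ("Hidden save option", "moderate", 2, 7),
    ]

    def score(problem):
        _, severity, complexity, value = problem
        # Favor severe, high-value problems that are cheap to fix.
        return SEVERITY[severity] * value / complexity

    for p in sorted(problems, key=score, reverse=True):
        print(f"{score(p):5.2f}  {p[0]}")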

Getting problems confirmed and extended

Consistent with conclusions from another study [3], including the development team in the formative evaluation provided the developers with a more specific understanding of the usability problems, an understanding more detailed than what comes from simply reading a usability problem report [3]. They were already aware of, or had ideas about, possible usability problems, but in line with findings from another study [4], they found it useful to get confirmation or disconfirmation. What is more interesting is that their fuzzy ideas about problems were concretized and extended. For example, the flow of operations on a particular screen was not in line with the flow the users found logical and the way the users wanted to interact with the application. The project manager characterized the insight into this design flaw as a big eye-opener. It was not identified as a specific usability problem, but the development team identified it as a more generic design problem leading to usability problems. Interestingly, the most significant redesign considerations were sparked by feedback mainly gained through the involvement in the evaluation and redesign workshop, and less by the specific usability problems.

Design changes

During the redesign process, a couple of significant design changes were decided on.

As mentioned above, the flow of operations and the order of options on a screen were found to be problematic. While this was not a specific usability problem, the development team decided to work on it during the redesign workshop. During the initial design of the application, they had wanted to make the application ‘flashy’ (see figure 1.) During the workshop, they instead created redesign proposals based on the insights from the evaluation and the basic interaction design principles introduced during the short lecture. Afterward, they further evolved these proposals into a specific design (see figure 2.) Similar findings have been reported by previous work [2].

At the time of the usability evaluation and redesign workshop, two applications existed: the TonePrint Editor and the TonePrint App. The two applications have since been merged into one application that runs on all major devices to make it easier for the users.

A couple of take-aways

The process of prioritizing the identified usability problems made the fixing of usability problems more goal-oriented. For example, instead of simply adding problems to the backlog, there were clear thoughts behind which problems to prioritize. This included considering the severity, complexity, and business value ratings, as well as the estimated resources needed to fix a given problem.

Having the development team actively involved in both the formative usability evaluation and the redesign workshop provided insights about the current application design that would not have been gained if both the evaluation and the creation of redesign proposals had been outsourced. Regarding the identified usability problems, the developers got a more specific and extensive understanding beyond what was reported in the usability problem list. Insights about the current state of the usability of an application do not merely come from reading a report based on a formative usability evaluation.

The redesign workshop provided a frame for working with the insights. This sparked new ideas and a set of redesign proposals that were later matured and evolved into implementable designs. The final design shows that the development team was able to combine insights from the evaluation with basic principles of interaction design.

The short conclusion is that usability work makes sense and has an impact as long as the understanding of usability work goes beyond purely conducting usability evaluations.

Upcoming work

During the upcoming weeks, I will do a more comprehensive analysis of the interview and investigate the above themes in more detail. My co-authors and I will be submitting this case study to the Industry Experiences track at the NordiCHI 2016 conference, so fingers crossed that the reviewers will find the paper interesting enough for a presentation.

References

  1. Bornoe, N., Billestrup, J., Andersen, J. L., Stage, J., & Bruun, A. (2014, October). Redesign workshop: involving software developers actively in usability engineering. In Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational (pp. 1113-1118). ACM. DOI: 10.1145/2639189.2670288
  2. Bruun, A., Jensen, J. J., Skov, M. B., & Stage, J. (2014, September). Active Collaborative Learning: Supporting Software Developers in Creating Redesign Proposals. In International Conference on Human-Centred Software Engineering (pp. 1-18). Springer Berlin Heidelberg. DOI: 10.1007/978-3-662-44811-3_1
  3. Hoegh, R. T., Nielsen, C. M., Overgaard, M., Pedersen, M. B., & Stage, J. (2006). The impact of usability reports and user test observations on developers’ understanding of usability data: An exploratory study. International journal of human-computer interaction, 21(2), 173-196. DOI: 10.1207/s15327590ijhc2102_4
  4. Hornbæk, K., & Frøkjær, E. (2005, April). Comparing usability problems and redesign proposals as input to practical systems development. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 391-400). ACM. DOI: 10.1145/1005261.1005274
  5. Law, E. L. C. (2006). Evaluating the downstream utility of user tests and examining the developer effect: A case study. International Journal of Human-Computer Interaction, 21(2), 147-172. DOI: 10.1207/s15327590ijhc2102_3
  6. Nørgaard, M., & Hornbæk, K. (2009). Exploring the value of usability feedback formats. Intl. Journal of Human–Computer Interaction, 25(1), 49-74. DOI: 10.1080/10447310802546708
  7. Uldall-Espersen, T., Frøkjær, E., & Hornbæk, K. (2008). Tracing impact in a usability improvement process. Interacting with Computers, 20(1), 48-63. DOI: 10.1016/j.intcom.2007.08.001
  8. Wixon, D. (2003). Evaluating usability methods: why the current literature fails the practitioner. interactions, 10(4), 28-34. DOI: 10.1145/838830.838870