The Most Critical Market Trends in Higher Education – Part 2

In Part 1, we examined the first two critical trends affecting the higher education market: Digital Equity and BYOD Support. In this article, we’ll look at the remaining three trends facing IT leaders.

Rethink, Repurpose, and Reclaim

First up is the desire to rethink, repurpose, and reclaim space across campus. The goal may be to create collaborative learning spaces, to convert rooms back into instruction space, or something different altogether. High on the list for most campuses is reimagining the “old and dusty” computer labs. Moving the delivery of these workstations to something more flexible enables campuses to reclaim the valuable real estate they so desperately seek. But how?

One solution to this problem is shifting to on-premises VDI (virtual desktop infrastructure); however, this is a very expensive and complex initiative. Most schools will spend close to $1 million and six months to a year on planning and execution.

A better option is to move those physical labs to the cloud. Here, there are two distinct approaches: do-it-yourself (DIY) or a managed service, like Apporto. While DIY probably feels like the cheaper option, that’s not always the case. There are plenty of horror stories in the news of a newly appointed cloud admin getting a configuration wrong and exposing campus data to the public, along with reports of excessive monthly bills. I recently spoke to a school that let faculty members create and manage their own cloud desktops. Unfortunately, one faculty member built a massive research desktop and then forgot to power it off over the holiday break. Administrators were shocked when faced with a $50,000 cost overrun just for the month of December!
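To put that kind of overrun in perspective, the arithmetic is simple. Below is a minimal Python sketch; the hourly rate and the hours are illustrative assumptions, not actual pricing from any provider.

```python
# Hypothetical illustration: the cost of a large cloud VM left running all month.
# The hourly rate below is an assumption for a GPU-class research instance,
# not a quote from any specific provider.

HOURLY_RATE_USD = 65.00        # assumed on-demand rate for a large GPU instance
HOURS_IN_DECEMBER = 31 * 24    # 744 hours if the VM is never powered off
PLANNED_USAGE_HOURS = 40       # what the researcher actually needed that month

always_on_cost = HOURLY_RATE_USD * HOURS_IN_DECEMBER
planned_cost = HOURLY_RATE_USD * PLANNED_USAGE_HOURS

print(f"Left running all month: ${always_on_cost:,.2f}")               # ~$48,360
print(f"Powered off when idle:  ${planned_cost:,.2f}")                 # $2,600
print(f"Overrun:                ${always_on_cost - planned_cost:,.2f}")
```

Simple idle time-outs or scheduled power-off policies close that gap, which is exactly the kind of guardrail to look for in a managed service.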

This points to choosing a managed service provider for the school’s cloud computing needs, but not all managed service offerings are created equal. Oftentimes, the provider does not want to be responsible for customer mistakes and therefore locks down access to the school’s own environment. This creates a bit of a hostage situation where the customer is beholden to the provider’s time, availability, and charges to complete basic tasks. (We do things differently at Apporto.)

This leads us to our fourth market trend: Cloud First.

Does every workload belong in the cloud? Many colleges and universities are coming to the conclusion that no, in fact, it does not. What we’re starting to see more of is a Cloud Smart approach, where IT leaders look closely at the benefits and costs of moving specific workloads to the cloud.

One cloud model that is certainly outpacing all others is that of SaaS or software-as-a-service. We’re seeing this model adopted across all sectors of higher education, from enrollment systems to finance, from learning management systems to even some academic software titles. SaaS is probably the easiest way to move systems to the cloud – think Netflix – and is how we built our offering at Apporto.

 


Shrinking IT Budget

The final market trend facing higher education has also been around for years, but it is certainly more prevalent in these days of the Great Resignation, uncertain economic times, and constant upheaval in the job market: shrinking IT budgets. Anyone who works in education is no stranger to this trend.

As if to add insult to injury, IT leaders are trying to consider the previous four trends we covered, find and implement solutions, and do so without any extra budget or staff.  Here is where it pays to be Cloud Smart and find a vendor that can help address all of the prevailing market trends and deliver a solution at a reasonable price.

In addition to the trends mentioned earlier, it is important to keep an eye on unexpected trends, as well. Recent months have shown that unexpected developments, such as the emergence of ChatGPT, can quickly become game-changers in the industry. In the past, other trends like the Mac vs. Windows debate and the early days of the Internet were similarly unexpected and had a significant impact on the industry. Higher education leaders face the challenge of being able to change, pivot, adopt, or defend against new technologies with agility. While most schools I’ve spoken with have a 5-year strategic plan for vision-level planning, they often struggle with the details and integration when it comes to new technologies. Therefore, it is crucial for leaders to be prepared for the unexpected and to approach it with a positive mindset. Overall, embracing the unexpected can lead to innovative solutions and better outcomes for the education industry.

Conclusion

I would submit that while the landscape for campus computing has changed dramatically in recent years, there are good options available for all use-cases. It’s important to be flexible with long-term plans and seek out solutions that balance staff effort against cost, but that don’t negatively impact the student experience. I have a solution in mind. What about you?

Happy Computing!

Technology Comparison: DaaS versus Application Virtualization


There is a long history in higher education of trying to solve the challenge of delivering the right academic software title to the right student at the right time. Different solutions take different approaches.

For example, SCCM and Intune focus on the physical image and management of the physical desktop fleet. Another approach is to utilize an app layering technology such as MSI/MSIX, Turbo.net, or Cloudpaging. A third approach is to simply virtualize the whole desktop, apps, and all, using VMware or Citrix.

Each of these methods has both advantages and disadvantages, as well as dramatically different costs. And now we also have cloud-hosted solutions to consider, further expanding our list of choices. In this article, we’ll focus on the benefits and challenges of a cloud-hosted desktops-as-a-service (DaaS) solution, like Apporto, and compare those to the world of black art known as application virtualization.

DaaS

First, what is desktops-as-a-service? Desktops as a service (DaaS) is a cloud-based delivery model for virtual desktop infrastructure (VDI) that enables organizations to provide virtual desktops to end-users from a remote data center or cloud provider.

In a DaaS environment, the desktop operating system, applications, and data are hosted in the cloud and delivered over the internet to a user’s device. This means that end-users can access their desktop environment from any device with an internet connection, without the need for a traditional desktop computer or local installation of software.

DaaS can provide several benefits for organizations, including:

  • Scalability: DaaS enables organizations to scale their virtual desktop infrastructure up or down to meet changing demand, without the need for significant upfront investment in hardware or infrastructure.
  • Flexibility: With DaaS, end-users can access their desktop environment from any device with an internet connection, making it easier to support a mobile or remote workforce.
  • Simplified management: By moving desktop infrastructure to the cloud, organizations can simplify management and reduce the burden of maintenance and updates.
  • Improved security: DaaS can help to improve security by centralizing desktop infrastructure in a secure data center or cloud provider, reducing the risk of data breaches or other security threats.

However, it’s worth noting that DaaS may not be suitable for all organizations or use cases. Factors such as network latency, bandwidth, and application performance can impact the user experience in a DaaS environment, and some organizations may have specific compliance or regulatory requirements that cannot be met through a cloud-based solution.
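Latency in particular is worth measuring before committing. Here is a minimal Python sketch that times TCP connections to a desktop gateway; the hostname and the 100 ms interactivity budget are illustrative assumptions, not a requirement from any vendor.

```python
# Rough round-trip check against a DaaS gateway using TCP connect time.
# GATEWAY_HOST is a hypothetical endpoint; the 100 ms budget is an assumption
# about what keeps a remote desktop feeling responsive, not a vendor spec.
import socket
import time

GATEWAY_HOST = "desktops.example.edu"
GATEWAY_PORT = 443
LATENCY_BUDGET_MS = 100
SAMPLES = 5

def connect_time_ms(host: str, port: int) -> float:
    """Time a single TCP connection setup, in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass  # connection is closed automatically on exit
    return (time.perf_counter() - start) * 1000

samples = [connect_time_ms(GATEWAY_HOST, GATEWAY_PORT) for _ in range(SAMPLES)]
average = sum(samples) / len(samples)

print(f"Average TCP connect time: {average:.1f} ms over {SAMPLES} samples")
print("Within budget" if average <= LATENCY_BUDGET_MS else "Investigate the network path")
```

Results consistently over the budget are a signal to pick a closer region or revisit the network path before blaming the platform itself.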

Application Virtualization

Now let’s review application virtualization. Application virtualization is a technology that allows applications to be delivered and executed on a user’s computer without being installed on the local operating system. Instead, the application runs in a virtual environment, isolated from the underlying operating system and other applications.

There are several benefits to application virtualization, including:

  • Simplified application management: With application virtualization, applications can be centrally managed and delivered to end-users on demand, without the need for manual installation or maintenance.
  • Improved compatibility: By isolating applications from the underlying operating system, application virtualization can help to overcome compatibility issues with other software and hardware.
  • Enhanced security: Application virtualization can help to improve security by isolating applications and their associated data from other applications and the underlying operating system, reducing the risk of malware or other security threats.
  • Flexibility: With application virtualization, applications can be delivered to a wide range of devices and operating systems, making it easier to support a diverse user base.

There are some pitfalls and challenges with application virtualization, though, as we’ll see.

So, how do these two very different technologies compare and contrast? There are two main criteria that should be considered. The first relates to the effort involved to build, maintain, and manage the system. The second is around where the horsepower needed to run the application actually comes from.


Effort and TCO

With both solutions, there will be some initial time and effort required to scope and assess the requirements for the project. There will also be project management oversight and tracking of deliverables to ensure success. But that’s where the similarities come to an end.

DIY versus DaaS/SaaS

The single biggest difference between these two delivery technologies is the effort required to get things off the ground and running smoothly. Nearly all application virtualization products fall into the do-it-yourself (DIY) category and require hundreds of hours to learn the preparation and packaging process. Repackaging an application is a black art that must be mastered through training, practice, and trial and error. The standard steps include the following (a simplified packaging sketch follows the list):

  1. Application Assessment: The first step in creating a virtualized application is to assess the application’s requirements, dependencies, and compatibility with the virtualization solution. This involves identifying any dependencies on specific versions of operating system libraries, registry entries, or other components that may need to be packaged with the virtualized application.
  2. Application Sequencing: Once the application has been assessed, the next step is to sequence the application. Sequencing involves capturing the application installation and creating a package that contains all the files, settings, and dependencies required to run the application in a virtualized environment. The sequencing process typically involves installing the application on a clean virtual machine and using a sequencing tool to capture changes made to the file system, registry, and other settings.
  3. Package Customization: After the application has been sequenced, the next step is to customize the package to meet the organization’s requirements. This may involve configuring settings such as license keys, default preferences, or other customizations that are specific to the organization.
  4. Package Testing: Once the package has been customized, it’s important to test the virtualized application to ensure that it works as expected. This involves testing the application in a virtual environment and verifying that it runs without errors and behaves as expected.
  5. Package Deployment: After the package has been tested, it’s ready for deployment. The package can be deployed to end-users using a variety of methods, such as publishing it to a virtual desktop infrastructure, deploying it through a software distribution system, or providing it as a download from a web portal.
  6. Package Maintenance: After the package has been deployed, ongoing maintenance may be required to ensure that it remains compatible with the virtualization solution and any changes to the underlying operating system. This may involve updating the package to include new dependencies or addressing compatibility issues that arise due to changes in the virtualization solution or underlying operating system.
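To make steps 2 through 5 a little more concrete, here is a minimal Python sketch that drives the MSIX packaging route mentioned earlier, using the MakeAppx and SignTool utilities that ship with the Windows SDK. The directories, package name, and certificate are placeholder assumptions; dedicated sequencers (such as the App-V Sequencer) have their own tooling and workflows.

```python
# Minimal sketch: package an already-captured application directory as MSIX and
# sign it, using MakeAppx and SignTool from the Windows SDK. All paths, the
# package name, and the certificate password are placeholder assumptions.
import subprocess

CONTENT_DIR = r"C:\Packaging\StatsApp"        # captured files plus AppxManifest.xml
PACKAGE_OUT = r"C:\Packaging\StatsApp.msix"
CERT_FILE = r"C:\Packaging\signing-cert.pfx"  # hypothetical code-signing certificate
CERT_PASSWORD = "changeme"

# Build the package from the captured directory (roughly steps 2 and 3 above).
subprocess.run(
    ["MakeAppx.exe", "pack", "/d", CONTENT_DIR, "/p", PACKAGE_OUT],
    check=True,
)

# Packages must be signed before endpoints will trust and install them.
subprocess.run(
    ["SignTool.exe", "sign", "/fd", "SHA256", "/f", CERT_FILE,
     "/p", CERT_PASSWORD, PACKAGE_OUT],
    check=True,
)

print(f"Packaged and signed {PACKAGE_OUT}; ready for testing and deployment.")
```

Even in this stripped-down form, the certificate handling, manifest authoring, and per-update repackaging hint at why the process takes training and practice to master.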

As you can see, this solution is not for the faint of heart and requires a deep level of technical understanding of both software architectures and operating systems. Another point to consider is whether the original license agreement prohibits modifying the installation media or process provided by the software publisher, which could expose your organization to legal trouble.

Application Virtualization Matrix:  https://www.whatmatrix.com/comparison/Application-Virtualization#

Desktops-as-a-service (DaaS) typically comes in two flavors: DIY or managed service. Deploying DaaS as a do-it-yourself exercise comprises the following steps (a back-of-the-envelope sizing sketch follows the list):

  1. Assessing Requirements: The first step in deploying DaaS is to assess the organization’s requirements and determine if DaaS is the best fit. This may involve evaluating the current desktop environment, identifying pain points, and determining if there are use cases that would benefit from DaaS.
  2. Choosing a DaaS Provider: Once the decision is made to move forward with DaaS, the organization will need to choose a DaaS provider. This involves evaluating different providers based on factors such as pricing, performance, scalability, and support.
  3. Configuring the Virtual Desktop Environment: Once a DaaS provider is chosen, the organization will need to configure the virtual desktop environment. This involves setting up the virtual environment, provisioning virtual desktops, and ensuring that the virtual desktops are compatible with the organization’s applications and data.
  4. Testing and Deployment: Before deploying virtual desktops to end-users, it’s important to thoroughly test the virtual environment and ensure that the desktops meet the organization’s requirements for performance, security, and functionality.
  5. Integration with Applications and Data: Once the virtual desktop environment is deployed, the organization will need to integrate the virtual desktops with its applications and data. This may involve configuring access to databases, file shares, and other resources that the organization relies on.
  6. End-User Access: After the virtual desktop environment is fully configured, end-users can access their virtual desktops from any device with an internet connection. The organization may need to provide training and support to end-users to help them get started with using their virtual desktops.
  7. Monitoring and Maintenance: Once the virtual desktops are in use, ongoing monitoring and maintenance will be required to ensure that the virtual environment remains secure, up-to-date, and optimized for performance. This may involve monitoring virtual desktop usage, troubleshooting issues, and applying updates or patches to the virtual environment as needed.
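Sizing the environment in step 3 is where many DIY projects stumble. Here is a back-of-the-envelope Python sketch for estimating session hosts and monthly cost; every figure in it (concurrency, users per host, host cost) is an assumption to replace with your own measurements.

```python
# Back-of-the-envelope session-host sizing for a DIY DaaS/VDI pool.
# Every input below is an illustrative assumption; measure your own workloads.
import math

ENROLLED_USERS = 3000        # students with access to the virtual lab
PEAK_CONCURRENCY = 0.15      # assume 15% log in during the busiest hour
USERS_PER_HOST = 12          # assumed sessions one host handles comfortably
HOST_MONTHLY_COST = 900.00   # assumed cost of one session host per month
HEADROOM = 1.2               # 20% buffer for spikes and maintenance windows

peak_sessions = math.ceil(ENROLLED_USERS * PEAK_CONCURRENCY)
hosts_needed = math.ceil(peak_sessions * HEADROOM / USERS_PER_HOST)
monthly_cost = hosts_needed * HOST_MONTHLY_COST

print(f"Peak concurrent sessions: {peak_sessions}")         # 450
print(f"Session hosts needed:     {hosts_needed}")          # 45
print(f"Estimated monthly cost:   ${monthly_cost:,.2f}")    # $40,500.00
```

The same arithmetic applies whether the hosts live in your data center or in a cloud subscription; only the unit cost changes.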

Again, not for the faint of heart! Here too, a deep technical understanding of compute, storage, networking, and security is required for a successful implementation of DaaS, whether deployed on-premises or cloud-hosted.

Luckily, there is another option: DaaS through a managed service provider like Apporto. This makes the entire process turnkey, with very little effort required from the organization’s IT department. Customers enjoy a full-fledged software-as-a-service (SaaS) experience, much as they would with Netflix or Salesforce.

Horsepower

Lastly, let’s consider an essential aspect of the equation: horsepower, or more specifically, who provides it. In any application virtualization solution, the goal is to deliver the virtualized software package to an endpoint, where end-users run it for their daily tasks. This means each endpoint must meet the minimum requirements of the software, or the virtual package will not function correctly, or at all. That pushes additional costs onto the organization to provide suitable endpoints, or onto users to supply appropriately configured devices.
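A quick way to see the impact is to check whether a given endpoint meets a title’s minimum requirements. The sketch below uses Python with the psutil package (assumed to be installed); the 4-core/8 GB thresholds stand in for a hypothetical statistics package, not a real title’s spec sheet.

```python
# Check whether this endpoint meets a hypothetical title's minimum requirements.
# The thresholds are placeholders, not a real spec sheet; psutil is a
# third-party package (pip install psutil).
import os
import platform

import psutil

MIN_CPU_CORES = 4
MIN_RAM_GB = 8
REQUIRED_OS = "Windows"   # most academic titles still target Windows

cores = os.cpu_count() or 0
ram_gb = psutil.virtual_memory().total / (1024 ** 3)
os_name = platform.system()

checks = {
    f"CPU cores >= {MIN_CPU_CORES}": cores >= MIN_CPU_CORES,
    f"RAM >= {MIN_RAM_GB} GB": ram_gb >= MIN_RAM_GB,
    f"OS is {REQUIRED_OS}": os_name == REQUIRED_OS,
}

for check, passed in checks.items():
    print(f"{'PASS' if passed else 'FAIL'}: {check}")

if not all(checks.values()):
    print("This endpoint cannot run the locally delivered package; a cloud-hosted desktop sidesteps the problem.")
```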

With DaaS solutions, the horsepower comes from the backend servers, whether hosted on-premises or in the cloud. There is still a cost associated with this solution, but it frees users from having to buy a specific type of device. This is especially important in higher education, where the burden of tuition already weighs heavily on students.

In Summary

To summarize: application virtualization demands significant DIY effort and relies on the endpoint for its horsepower, while a managed DaaS offering requires minimal in-house effort and shifts the horsepower to backend servers.

Although application virtualization may seem exciting and attractive at first glance, the reality of deploying it at scale can pose various challenges for organizations. Throughout my career, I’ve worked with several application virtualization products, and while they promised a lot, none could deliver 100% on their promises. However, DaaS – especially with the application of SaaS methodologies, as demonstrated by Apporto – provides a reliable solution worth exploring. Over 200 satisfied customers can attest to this.

Happy Computing!

Best Ways to Achieve 100% Endpoint Compliance

IT professionals understand that securing their internal systems and data starts with securing the endpoints their various user groups use for daily tasks. But with the landscape shifting to a more mobile, hybrid, and remote workforce, how can they best reach the target of 100% endpoint compliance?

There has been a clear progression of management systems over time from System Center Configuration Manager (SCCM) to virtual desktop infrastructure (VDI) to desktops-as-a-service (DaaS). Each has benefits and drawbacks, so let’s dig in.

SCCM

Microsoft System Center Configuration Manager (SCCM) is a comprehensive management tool designed to help administrators deploy, manage, and monitor devices and applications in an enterprise environment. It provides a centralized platform for IT professionals to automate various tasks related to software deployment, patch management, operating system deployment, and system updates.

Key Features and Functions:

Software Deployment: SCCM enables administrators to efficiently deploy software applications across multiple devices within an organization. It supports automated software installation, remote installation, and deployment targeting based on user or device-specific criteria.

Patch Management: SCCM helps administrators keep the software and operating systems on devices up to date by managing and deploying patches and updates. It allows for patching both Microsoft and third-party applications, ensuring security and stability across the network.

Operating System Deployment: SCCM facilitates the automated deployment of operating systems to new or existing devices. Administrators can create standardized OS images, customize configurations, and remotely install the OS on multiple devices simultaneously.

Inventory and Asset Management: SCCM provides comprehensive inventory and asset management capabilities, allowing administrators to track and manage hardware and software assets across the organization. It collects detailed information about devices, software installations, and hardware configurations.

Endpoint Protection: SCCM integrates with Microsoft Defender Antivirus to provide endpoint protection features. Administrators can centrally manage antivirus policies, monitor protection status, and respond to security threats.

Reporting and Monitoring: SCCM offers reporting and monitoring tools to gather insights into device and application health, compliance, and usage. It provides real-time monitoring and generates reports that help administrators make informed decisions.

Challenges and Drawbacks:

While SCCM does a great job with most devices, there are a few areas that can be challenging and potentially block the achievement of 100% compliance.

Complexity: SCCM is a feature-rich and highly configurable tool, which can lead to complexity in its implementation and management. Setting up SCCM requires careful planning, expertise, and familiarity with its various components and configurations.

Learning Curve: Due to its complexity, SCCM has a steep learning curve for administrators who are new to the tool. Acquiring the necessary skills and knowledge to effectively utilize SCCM may take time and training.

Infrastructure Requirements: SCCM relies on a robust infrastructure to operate efficiently. It requires dedicated server resources, such as database servers, distribution points, and management points. Organizations need to allocate the necessary hardware, network bandwidth, and storage capacity to support SCCM effectively.

Scalability and Performance: Large-scale deployments or managing a vast number of devices can strain the performance of SCCM infrastructure. Ensuring scalability and optimal performance may require careful monitoring, tuning, and additional hardware resources.

Software Compatibility: SCCM primarily focuses on managing Microsoft-based systems and applications. While it supports third-party software deployments, ensuring compatibility and smooth integration with all applications can be challenging. Some third-party applications may require additional customization or workarounds for effective management.

Overhead and Maintenance: SCCM requires regular maintenance tasks, such as software updates, database maintenance, and distribution point management. These activities may require dedicated resources and can consume time and effort.

Add to the above the robust knowledge of device driver management across many manufacturers and models that is required. And finally, in the new remote-work landscape, SCCM only works with devices it can reach. Offline or unreachable devices will not receive critical updates and patches.
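That reachability gap is what usually caps the compliance number in practice. A minimal Python sketch of the arithmetic, using made-up inventory figures:

```python
# Illustrative compliance math for a tool that can only patch devices it can
# reach; the inventory numbers below are made up.
TOTAL_DEVICES = 5000
UNREACHABLE = 400      # off-network laptops, powered-off lab machines, etc.
PATCH_FAILURES = 150   # reachable devices where the deployment failed

compliant = TOTAL_DEVICES - UNREACHABLE - PATCH_FAILURES
compliance_rate = compliant / TOTAL_DEVICES * 100
ceiling = (TOTAL_DEVICES - UNREACHABLE) / TOTAL_DEVICES * 100

print(f"Current compliance:            {compliance_rate:.1f}%")  # 89.0%
print(f"Ceiling while devices offline: {ceiling:.1f}%")          # 92.0%
```

In other words, until the unreachable devices come back online, 100% is simply out of reach for an agent- and network-dependent tool.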

On balance, the total cost of ownership using SCCM is fairly reasonable, but the effort required to achieve 100% endpoint compliance is high.

VDI

Virtual Desktop Infrastructure (VDI) is a technology that enables the delivery of desktop environments to end-users from a centralized server or cloud infrastructure. Instead of running applications and storing data on individual physical devices, VDI allows users to access a virtual desktop from any device with an internet connection.

Benefits of VDI:

Centralized Desktop Management: VDI centralizes desktop management, allowing administrators to deploy, configure, and update desktop environments from a single location. This simplifies IT tasks, reduces maintenance efforts, and ensures consistent configurations across all virtual desktops.

Remote Access and Mobility: VDI provides remote access to desktops, enabling users to access their virtual desktops and applications from anywhere, using various devices like laptops, tablets, or thin clients. This enhances productivity and facilitates mobile and remote work scenarios.

Improved Security: With VDI, sensitive data remains stored in the data center or cloud infrastructure, rather than on individual devices. This helps reduce the risk of data loss or theft from lost or stolen devices. Centralized security policies and controls can be implemented to protect virtual desktops and data.

Hardware Utilization: VDI allows for better utilization of hardware resources. Multiple virtual desktops can run on a single physical server, reducing the overall hardware requirements and energy consumption. This can result in cost savings and improved resource efficiency.

Streamlined Application Deployment: Applications can be installed and managed centrally in a VDI environment, reducing the complexities of application management across multiple endpoints. Administrators can easily update and deploy applications to all virtual desktops simultaneously, ensuring consistency and simplifying maintenance.

Enhanced Disaster Recovery: VDI environments can be backed up and replicated, making disaster recovery easier and more efficient. In case of hardware failures or other disruptions, users can quickly switch to alternative virtual desktops without significant downtime.

User Experience and Flexibility: VDI provides a consistent user experience across different devices, as users can access their personalized virtual desktop environment from any compatible endpoint. Users can easily switch devices without interrupting their work and have the flexibility to customize their virtual desktop to suit their preferences.

Challenges and drawbacks

The overall intent of VDI was to overcome the drawbacks of using an endpoint management tool like SCCM; however, it introduces some new challenges that must be taken into consideration.

Infrastructure Complexity: Implementing and managing a VDI infrastructure can be complex and resource-intensive. It requires robust server hardware, storage systems, and networking infrastructure to support the virtual desktops and ensure optimal performance. Organizations need to invest in the right infrastructure and have skilled IT personnel to handle the complexities.

Cost Considerations: VDI can involve significant upfront costs, including hardware, software licenses, virtualization technology, and storage infrastructure. Additionally, ongoing operational costs such as maintenance, upgrades, and support should be factored in. Organizations need to evaluate the total cost of ownership and determine if the benefits justify the investment.

Scalability and Performance: Scaling a VDI environment to accommodate a large number of users or handle peak workloads can be challenging. Ensuring adequate server resources, network bandwidth, and storage capacity are critical for maintaining performance and responsiveness. Organizations need to plan for scalability and regularly monitor and optimize the infrastructure.

User Experience: While VDI aims to provide a consistent user experience, factors like network connectivity, latency, and device capabilities can impact performance. In remote or low-bandwidth scenarios, users may experience lag or reduced functionality. Ensuring a satisfactory user experience across different devices and locations can require careful planning and optimization.

Application Compatibility: Certain applications may not be compatible with a virtualized environment or may require specific configurations. Graphics-intensive applications, legacy software, or applications with hardware dependencies can present challenges in VDI deployments. Compatibility testing and potential workarounds may be needed to ensure smooth application delivery.

Network Bandwidth Requirements: VDI heavily relies on network connectivity to deliver virtual desktops and transmit data between the server and endpoints. Bandwidth requirements can be significant, especially during peak usage times or when multimedia content is involved. Adequate network capacity and proper network design are crucial to prevent performance bottlenecks.

Data Security and Compliance: While VDI can enhance security by centralizing data and applications, it also introduces new security considerations. Securing the virtual desktop infrastructure, protecting data during transmission, and ensuring compliance with regulations require proper planning and implementation of security measures, including access controls, encryption, and monitoring.

While the benefit of centralized management and nearly immediate compliance with virtual desktops is appealing, the total cost of ownership (TCO) is a major drawback for most organizations.

Achieving 100% compliance with VDI requires fairly low effort; however, the total cost of ownership is probably the highest of all the solutions discussed here.


DaaS

Desktops-as-a-Service (DaaS) is a cloud computing model that delivers virtual desktop infrastructure (VDI) from a service provider to end-users over the internet. In DaaS, the virtual desktops and associated applications are hosted and managed in the cloud, eliminating the need for organizations to deploy and maintain their own VDI infrastructure.

Key Features and Benefits:

Cloud-based Virtual Desktops: DaaS provides virtual desktops that run in the cloud, allowing end-users to access their desktop environments from various devices, including laptops, tablets, and thin clients. The virtual desktops are hosted and managed by a third-party service provider, relieving organizations of the infrastructure and management responsibilities.

Pay-as-you-go Model: DaaS typically follows a subscription-based or pay-as-you-go pricing model. Organizations pay for the virtual desktops and services they consume on a per-user or per-month basis. This offers flexibility and scalability as organizations can easily scale up or down based on their needs without upfront capital investments.

Outsourced Infrastructure Management: With DaaS, the service provider handles the management and maintenance of the virtual desktop infrastructure, including server hardware, storage, networking, and software updates. This frees up IT resources and reduces the burden of infrastructure management for organizations.

Anywhere, Anytime Access: DaaS enables users to access their virtual desktops from anywhere with an internet connection. Users can log in and securely access their personalized desktop environment, applications, and data from different devices, facilitating remote work, mobile productivity, and collaboration.

Simplified Deployment and Management: DaaS simplifies the deployment and management of virtual desktops. Organizations can quickly provision new desktop instances, manage user access and permissions, and deploy applications centrally through an administrative portal. This streamlines IT operations and reduces the time and effort required for desktop management.

Enhanced Security and Data Protection: DaaS offers built-in security features and data protection mechanisms. Data resides in the cloud infrastructure, reducing the risk of data loss from lost or stolen devices. Service providers implement security measures such as access controls, data encryption, and backup solutions to ensure the security and integrity of virtual desktops and user data.

High Availability and Disaster Recovery: DaaS providers typically offer high availability and redundancy in their infrastructure, ensuring that virtual desktops are accessible and reliable. In case of hardware failures or disruptions, service providers maintain backups and implement disaster recovery measures to minimize downtime and ensure business continuity.

Compatibility and Application Support: DaaS supports a wide range of applications, including both standard and specialized software. Compatibility testing and application packaging are typically performed by the service provider to ensure smooth application delivery. Users can access their familiar applications and tools without compatibility concerns.

Potential Drawbacks:

Internet Connectivity Dependency: DaaS heavily relies on internet connectivity for users to access their virtual desktops. Users in areas with limited or unreliable internet connectivity may experience disruptions or reduced performance. Downtime or network outages can prevent users from accessing their virtual desktops until connectivity is restored.

Performance and Latency: The performance of DaaS is influenced by network latency and bandwidth. Users accessing virtual desktops from remote locations or over long distances may experience latency or sluggishness, especially when working with resource-intensive applications or multimedia content. Optimizing network connections and selecting geographically closer data centers can help mitigate this issue.

Vendor Dependency: Adopting DaaS means relying on a third-party service provider for the infrastructure and management of virtual desktops. Organizations should carefully choose a reputable provider and evaluate their track record, service-level agreements (SLAs), and support capabilities. Vendor lock-in and the potential risks associated with service provider changes should also be considered.

Application Compatibility and Performance: Some applications, especially those with specialized hardware requirements or specific integration needs, may not perform optimally in a DaaS environment. Compatibility testing and performance evaluation should be conducted to ensure that critical applications meet the required performance levels and functionality in a virtualized environment.

Here again, we see the evolution of delivery and compliance from on-premises VDI to DaaS where the overall TCO is lowered.

However, there is still a large learning curve to achieve compliance and, more importantly, to protect company data and systems from external threats.

Comparison Chart

Meet Apporto

Apporto is a fully managed cloud-based virtual desktop solution that enables users to access their desktop applications and files from any device with an internet connection and a modern browser. It allows organizations to provide a centralized and secure desktop environment to their users without the need for expensive hardware or infrastructure.

One of the key benefits of Apporto is that it eliminates the need for users to install and manage their own software and hardware. It also provides a high level of flexibility, as users can access their virtual desktops from anywhere, at any time, and on any device. Additionally, Apporto offers enhanced security features like best practices for zero-trust, least privilege access, and admin-managed Network Objects.

Users will enjoy a best-in-class user experience for both performance and ease of use, ensuring anywhere access and increased productivity.

Apporto offers a range of pricing plans, including options for educational institutions, businesses, and individuals. It is also easy to set up and use, with no special technical skills required.

Because Apporto is a fully managed service, the effort required to achieve 100% compliance is nearly zero and the cost optimization included with the platform makes total cost of ownership the lowest among all solutions.

As we can see, there is a clear evolution of approach for achieving endpoint compliance, from SCCM to VDI to DaaS, with the final step being Apporto. We simplify cloud desktops.

WiFi. Browser. Done.

The Most Critical Market Trends in Higher Education – Part 1

Throughout my IT career and now in my sales career, I’ve always been intrigued by the market forces organizations are up against and how they plan to respond. As a Solutions Architect, tracking market trends is part of my role, as is advising customers and potential customers about shifts they could see coming and about exciting new technologies.

In this article we look back over the past 12-18 months and a bit into the future, to examine the most important trends facing leaders in higher ed today.

The first trend dates back well beyond the 12-18 month timeframe, to years or even decades: Digital Equity. This has different meanings to different stakeholders on a college or university campus, and it can also be defined differently depending on which technology is being discussed. In my world, it’s all about access to academic software and compute power. During and following the COVID-19 global pandemic that shut down campuses and businesses everywhere, a spotlight was shone on the need for digital equity.

In higher ed, there are three definitive points for defining digital equity:

  1. Providing support to students using any device and enabling those students to access the software they need to complete their coursework.
  2. Guaranteeing the same experience without the need for anything extra.
  3. Achieving both numbers one and two without passing any additional financial burden on to students.

Let’s dig a little deeper. Supporting students on any device used to be fairly straightforward: offer physical computer labs. But students are demanding a variety of changes that are quickly making this model obsolete. First, they prefer to use their phones, tablets, and laptops, all of which come in a variety of makes, models, and operating systems. Second, they are looking for more social environments in which to study and learn. Third, they expect remote (i.e., away from campus) options for learning and for access to academic software.

“More than 84% of students believe having remote access to computer labs is important and could improve their performance…”

Source: https://www.splashtop.com/press/splashtop-survey-finds-84-of-university-students-want-remote-access-to-computer-labs

Bring Your Own (BYO)

Another market trend that feeds directly into the digital equity trend is the support of Bring Your Own (BYO) devices. As mentioned above, these devices show up in a variety of form factors running a variety of operating systems. Given that the majority of academic software is still developed for Windows™, providing support to the entire sphere of BYOD gets difficult very quickly.

Arguably the factor causing the most grief to higher ed leaders is Chromebooks. More and more students are graduating from K-12 schools where Chromebooks are widely distributed in an effort to provide digital equity.

A new report from the market researchers at Canalys claims that Chromebook sales surged 275 percent in the first quarter of 2021, dramatically outpacing the PC industry.

Source: https://www.thurrott.com/mobile/chrome-os/249863/report-chromebook-success-continues-into-2021

This sub-trend within the larger BYOD trend makes the entire effort toward digital equity in higher education extremely challenging. So what are the options to move forward?

If we meld these two market trends together (Digital Equity + BYOD Support), how does an institution go about guaranteeing the same experience to all students? Until recently, there were only two choices: provide laptops to all students (1:1 programs) or implement on-premises VDI (virtual desktop infrastructure). Both of these solutions represent a hefty budget increase for the school, and both require significant staff effort to get up and running and to sustain over the long term.
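To give a rough sense of the budget impact, here is a minimal Python sketch comparing the two traditional routes; every figure (device cost, refresh cycle, VDI build-out, student count) is an illustrative assumption rather than a quote.

```python
# Rough annualized cost of the two traditional routes to digital equity.
# Every figure below is an illustrative assumption, not a quote.
STUDENTS = 5000

# Option 1: 1:1 laptop program
LAPTOP_COST = 900.00         # assumed cost per student device
REFRESH_YEARS = 4            # assumed hardware refresh cycle
laptop_annual = STUDENTS * LAPTOP_COST / REFRESH_YEARS

# Option 2: on-premises VDI
VDI_BUILDOUT = 1_000_000.00  # assumed up-front build (servers, storage, licenses)
VDI_LIFESPAN_YEARS = 5
VDI_ANNUAL_OPEX = 150_000.00 # assumed support, maintenance, and power
vdi_annual = VDI_BUILDOUT / VDI_LIFESPAN_YEARS + VDI_ANNUAL_OPEX

print(f"1:1 laptop program: ${laptop_annual:,.0f} per year")  # $1,125,000
print(f"On-premises VDI:    ${vdi_annual:,.0f} per year")     # $350,000
```

Neither figure includes the staff effort noted above, which is often the larger line item over time.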

Virtual Computer Lab ROI Calculator

Apporto’s virtual computer labs maximize learning and optimize efficiencies at 50-70% less than the cost of traditional VDI solutions. See for yourself what the Navy and top universities like UCLA and Emory have already discovered by using our Virtual Computer Lab ROI Calculator.

Luckily, we now have the cloud and easy-to-consume cloud services like Apporto. There are two distinct categories of cloud computing, though: do-it-yourself (DIY) solutions and fully managed cloud desktop services. We’re not going to get into the differences, benefits, or drawbacks of these in this article, but watch this space for just such a comparison in the future.

Back to the three points of Digital Equity, and looking at number two: guaranteeing the same experience without the need for anything extra. This means not requiring students to download and install additional software, clients, VPNs, and so on. It also means not forcing students into a physical lab, onto loaner laptops, or into purchasing specific hardware. Guaranteeing the same experience from any device means a completely device-agnostic solution, almost like the early days of computing with a dumb terminal. A good real-world analogy is reading a book from the public library. You don’t have to pay anything extra, you don’t need any kind of special reading device, and everyone gets the same version (edition) of the book.

Financial Burden

Finally, point three of Digital Equity is not placing any undue financial burden on students. With the cost of attendance hitting all-time highs – exceeding $100,000 per year at some institutions – students are facing immense debt loads upon graduation, and requiring extra expenses in the form of laptops or computers to complete their degrees is a recipe for more debt. It is much more in alignment with the mission of higher education to provide free access to as many of the resources students need as possible, at little or no extra cost.

Source:  https://hechingerreport.org/university-of-chicago-projected-to-be-the-first-u-s-university-to-charge-100000-a-year/

This type of experience is critical to the success of college and university students completing their coursework. As the number of students eligible for admission continues to decline, an equitable digital and computing experience must be provided to serve students of all backgrounds and situations.

Source: https://www.vox.com/the-highlight/23428166/college-enrollment-population-education-crash

As we conclude part one of our look at the most critical market trends impacting higher ed institutions today, one must wonder: is it hopeless? Thankfully, no. There are good options available to help with these exact challenges, and perhaps none better than Apporto.

Happy Computing!

UCISA CIOs Demand Better Vendors – Meet Apporto

CIO Panel Discussion

In late February 2023, a group of leading CIOs from around the UK met online to discuss key points around campus strategies and vendor (supplier) relationships. The event was hosted by UCISA and was intentionally not recorded to give the panel confidence in being open and honest about issues facing their institutions.

Several members of the Apporto Exec Team attended and we’re pleased to share a brief synopsis with our comments here.

Cloud Services, Outsourcing, Managed Services
The first topic on the agenda was Cloud (and cloud services). The general consensus was that each institution represented had a Cloud-First strategy, but there were some struggles around direction and implementation. Most were in favor of pre-built, consumable cloud services falling into the Software-as-a-Service (SaaS) category. This falls in line with the worldwide market trends Apporto is tracking. Related to the desire for cloud services, and the struggle to implement them, are the topics of Outsourcing and Managed Services.

    “Feeding and watering tin in the datacenter may have been someone’s job for many years, but there are plenty of partners that can do that better than we can.” -UCISA Member CIO

This comment received widespread agreement from the group and additional comments noted that universities were happy to pay suppliers for this kind of effort and support.

How Apporto is Different
Purpose built for Higher Ed, Apporto’s cloud desktops fall squarely into the Desktops-as-a-Service category, which means we take care of a lot of the heavy lifting and specialist work. This solves the challenge of hiring and retaining qualified system and cloud administrators to build and maintain a Do-It-Yourself (DIY) solution internally. Apporto takes care of all the infrastructure, backup and recovery, monitoring, and maintenance so that your IT staff can concentrate on the strategic tasks and projects that can continue to elevate the rankings of your campus. In addition, our simplified control plane gives customer administrators the tools to quickly make configuration changes and new deployments with just a few clicks, saving valuable time.

As higher ed service delivery specialists, Apporto is the type of vendor the CIOs said they want to work with. One CIO noted specifically that solutions offered should solve higher ed challenges and not be wedged in just to make another sale.

Support Agreements, Teams, Coverage
Another important topic for the UCISA CIO panel was support agreements, teams, and coverage. Again, Apporto is different in that we only offer one support coverage option: 24x7x365. And it’s included at no additional cost with every Apporto service subscription. Further, the Customer Success and Support teams are introduced on day one, so that customers can start building relationships immediately – another concern of the panel.

Mergers, Pricing, Budgeting
Moving on to the commercial side of vendor interactions and relationships, there were several hot button topics that the panel spoke about. First was around mergers and acquisitions. A couple of the CIOs noted previous issues with suppliers where development focus or product direction changed as a result. This creates undue work for the university teams to find a replacement or try to adjust their use of a contracted product or service. An overarching theme during this part of the event was “transparency.” All of the panel members echoed each other’s sentiments, that while change in a company after a merger or acquisition is typical, giving advance notice to customers is key. In addition, rather than focusing on new products and upsells, customer relationships should focus on the delivery of products and services that were promised at the start of the contract.

Additional commercial concerns of the UCISA CIO members were price increases, price deadlines, and supply chain issues. In their collective view, a few standout misbehaviors should be avoided. First, setting a deadline on pricing – for example, if you don’t sign the contract by Friday the price will increase. This is not seen as being a collaborative partnership between the supplier and customer. The second is around supply chain issues. Vendors should not promise delivery of a product on a timeline they cannot achieve. This creates undue burden on the customer teams and can create a black mark against a supplier.

Many of the members were quick to chime in that the individual universities are all very interconnected, so if there is a misbehavior or bait-and-switch event with a vendor or supplier, the word will spread. Again, be transparent on known issues to deliver on time or at scale.

The third commercial concern and topic of discussion was the university budget cycle and price increases. Budgets are typically set 18 months in advance, so suppliers should price accordingly and not seek to impose price increases on short notice. In addition, they should be transparent about the cause of any increase. A couple of CIOs noted price increases from 3% to 29% with no justification or evidence to explain the jump in cost.

Apporto Provides a Better Commercial Experience
Because Apporto started its business in the higher ed sector, we understand all of these concerns. Not only have we built a technology platform that leverages cost optimization to provide savings for our customers, but our internal company culture is one of transparency and collaboration with our customers. We actively seek feedback through our individual customer engagements and collectively through our bi-annual Customer Town Hall events.

Our pricing model is based on a predictable annual subscription that is easy to budget for, and we limit price increases to the best of our ability. For example, we implemented our first price increase in three years on the 1st of January 2023, and it was only 6%. This was due to our public cloud providers passing the same increase on to us because of inflation in the United States. However, even with the increase, Apporto remains the most affordable cloud desktop solution for higher ed available today.

Apporto has a 7+ year track record of delighting our customers, and we look forward to the next seven years and beyond. While no one knows what the future holds, our current customers can rest assured that we will always be transparent about the development of our products and the direction of our company.

Carbon Zero, Cyber and Data Security
The final set of topics discussed by the panel included carbon zero initiatives, cybersecurity, and data security. There was some divergence between schools on the topic of carbon zero, with Scotland further ahead of the UK government on supplier requirements. However, all agreed that it was an important topic and that, regardless of government mandates, they would welcome vendors who embraced a carbon zero approach to the delivery of products and services.

Cybersecurity and data security, on the other hand, achieved consensus among the CIO panel members. All agreed that these topics should be planned for and addressed by all modern vendors.

The discussion then focused on which certifications suppliers should have in place around cybersecurity. The consensus was for vendors to focus on security frameworks and ensuring timely updates to meet the requirements for Cyber Essentials and Cyber Essentials Plus, yet not ignore GDPR for data regulations.

    “This is nothing new and suppliers should not be surprised by such requirements.” -UCISA Member CIO

An undercurrent during this part of the panel discussion was co-design with suppliers. Closely related to transparency, all of the CIOs noted the desire for their institutions and internal teams to be included in design discussions with suppliers. This is not only a desire during the sales process, but also during the contract term, so that new feature requests can be incorporated into product roadmaps.

Security is at the Core
At Apporto, we put cybersecurity and data security at the heart of our products and services. All customer deployments use a zero trust architecture and least privilege approach to data access. We design virtual environments to meet a variety of security controls such as PCI-DSS and HIPAA, and to ensure GDPR compliance. Working collaboratively with our customers, we can architect custom solutions to meet any need or requirement. In addition, our advanced approach to software deployment and 24/7 service ensures that security updates can be deployed to all of our customer cloud desktops within minutes.

Apporto – A True Partner
One of the tenets of Apporto is being a true partner to our customers. This is done through our technology platform and feature set for higher ed. It is done through our service delivery, our Customer Success and Support teams, and our company culture of collaboration and transparency. We pride ourselves on being a supplier our customers can sing the praises of, and we welcome all UCISA member institutions to come meet us and see how we’re the type of vendor they’ve been looking for.

A closer look at the possibilities of integrating ChatGPT into curriculums

Welcome to 2023! Perhaps the most exciting technology launch in the past six months is OpenAI’s ChatGPT language model. From the OpenAI website:

“ChatGPT: Optimizing Language Models for Dialogue
We’ve trained a model called ChatGPT which interacts in a conversational way. The dialogue format makes it possible for ChatGPT to answer follow-up questions, admit its mistakes, challenge incorrect premises, and reject inappropriate requests.”

While this has been the buzz in the technology arena, education is somewhat less enthused and quite predictably very polarized.

  • Read: https://www.theguardian.com/australia-news/2023/jan/10/universities-to-return-to-pen-and-paper-exams-after-students-caught-using-ai-to-write-essays
  • Read: https://www.nytimes.com/2023/01/12/technology/chatgpt-schools-teachers.html

However, I think this is the standard response to any new technology in education, and it will take some thought and effort to integrate the technology appropriately into curricula.

Thinking back to my own college days, I was one of the first students in my class to reference Internet articles in homework papers. Some professors were on board; others were dismissive and required that I go to the library for “hard” references. Fast forward 25 years, and I completed my Master’s degree entirely online with a lot of Internet-based references. Got a 4.0, too!
 

So with all this in mind, I launched a ChatGPT session and asked some pointed questions about its own use in education. Here’s how it went…


I found this to be a completely reasonable response with valid points. I’ve always favored a critical-thinking approach to teaching and learning, preferring to teach students how to find answers, seek help, and become self-sufficient over rote memorization and standardized tests.

Here are some good tips from the University of Washington’s Center for Teaching and Learning: 

  • Set clear policies for the use of AI in specific courses;
  • Communicate the importance of college learning;
  • Assess a student’s process of learning as much as (or more than) the outcome;
  • Acknowledge that the struggle is part of learning;
  • Consider teaching through AI-based tools.

Source: https://teaching.washington.edu/topics/preparing-to-teach/academic-integrity/chatgpt/
 

Continuing my discussion…


Again, valid points. I also like Cherie Shields’ assessment, writing in EdWeek:

“Rather than be wary of ChatGPT, we should embrace how this program can help struggling students learn how to organize their thoughts on paper. By using natural-language processing techniques, this AI tool can “understand” and analyze written or spoken language to generate responses or suggestions. I have used the program to create outlines, templates, and instructions. My experiments have shown me that ChatGPT has the potential to offer students a skeleton with which to begin any number of writing projects.”

Source: https://www.edweek.org/technology/opinion-dont-ban-chatgpt-use-it-as-a-teaching-tool/2023/01
 

So, I asked ChatGPT for some additional detail…


Good to see that this all aligns with the recommendations and assessments we’ve seen so far. As a tech guy, I’m more interested in something outside of language skills and researching term papers…


Very cool! Of course, I wanted some proof…


Not being a programmer myself, I had no way to validate whether this was accurate, but it seemed good. At the very least, it’s a good way to get started for someone brand new to the discipline.

Others have found the same:

  • https://twitter.com/amasad/status/1598042665375105024
  • https://twitter.com/justinstorre/status/1599483466927984640

And there are some great pieces offering guidance around the use of ChatGPT with coding and programming:

  • https://levelup.gitconnected.com/how-to-use-chatgpt-for-developers-4e7f354bbc02
  • https://cointelegraph.com/news/how-to-improve-your-coding-skills-using-chatgpt
  • https://lablab.ai/t/chatgpt-tutorial-how-to-easily-improve-your-coding-skills-with-chatgpt

 

For my final topic, I went in a different direction: foreign language studies. Here’s what ChatGPT provided…

This is in the realm of what I was expecting based on previous responses. One final question…

I did know that the phrase was wrong, but had no idea in how many ways! One of the fun things about the ChatGPT model is how conversational it is.  Here’s how I closed out my session…

A nice end to my first interaction with ChatGPT and overall, I love the power and promise it holds. Of course there will be challenges going forward, but I’m excited to see how different use-cases evolve and how the depth of knowledge continues to grow around this exciting technology.

At Apporto, we’re keeping a close eye on sentiment around ChatGPT and other AI models, and we have already added ideas to our product roadmap. Some of these ideas would block the use of such tools, while others would enable or embed these technologies in the Apporto service. Regardless of your personal stance, rest assured that Apporto will continue to serve the needs of our customers and users.

Happy Computing!