Technology is always changing. Our guidance and standards must keep pace to stay relevant and useful for departments. The Technology Code of Practice (TCoP) plays an important part in helping government organisations buy and build technology, and is the benchmark for Spend Controls for departmental spending on digital and IT services.
Recently, we’ve done lots of user research to understand how people are using the Technology Code of Practice, and we’re planning to iterate it based on this feedback.
The Technology Code of Practice was last iterated back in 2016. A wide group of industry and government stakeholders helped give us the document we have today. The Technology Leaders Network approved the TCoP, and HM Treasury has mandated it as part of Spend Controls. It is also an important part of the Government Transformation Strategy.
But, the current version of the TCoP was never meant to be the final version. Our aim is to keep iterating, based on feedback and insight from our users.
At the end of last year, we started work on making the TCoP better. We did some initial user research, and are continuing to do more detailed research with as many departments as possible.
So far, we’ve learned that teams across government would like clearer information on what is classed as a ‘standard’, and what we mean by best practice advice. They’d also like more specific examples about implementing the principles.
One important lesson is to consider legacy systems. We often have to buy and build technology in the context of these legacy systems, so while solving each unique issue isn’t practical, the guidance will aim to address the main challenges teams face.
We don’t have a monopoly on good practice, so we’ll be working with departments and organisations to look for good implementations of technology that follow our principles.
We’ll be writing about our user research in more depth soon.
The basis of the principles isn’t changing, and we’re still committed to standards that help government buy and build great technology. What we expect will change is the structure of the document as we work on providing greater clarity and more guidance and support for departments. We’ll also be working with Spend Controls to make the principles support their work as clearly as possible.
We’re taking our proposals to the next Technology Leaders Network meeting. As government’s senior technology decision-making body, they’ll play an important role in helping us realise our vision. We aim to have the next iteration out later this summer.
We need you! We can’t improve the Technology Code of Practice without help from departments, so please join us. If you’d like to help, please contact us.
You can sign up now for email updates from the Government Technology blog or subscribe to the feed.
Back in July 2016, we published a blog post setting out our vision for government contracts in the digital age.
We said that simplifying contracts and hosting them online would encourage a more diverse range of suppliers to apply to offer their services to government. It would also help the Crown Commercial Service (CCS) to provide public sector buyers with the best possible commercial deals for goods and services.
We want contracts that meet the needs of buyers and suppliers, as well as government’s needs. We’d like to share the findings from the discovery phase of our research. In many places, the things we discovered are broader than the initial stated focus on terms and conditions, but they do provide useful context.
The Government Transformation Strategy 2017 to 2020 states that we will renew our focus on transforming the way that the Civil Service and government works. One area of focus is leading a step change in procurement to ensure that user-centred, design-led, data-driven and open approaches are commonplace in contracting by 2020.
This project is an example of how the Government Digital Service (GDS) and CCS are working together to support this change. We’re combining our respective areas of expertise to meet the raised expectations of buyers and suppliers, who are the primary users of government contracts.
We contacted the 165 people who had signed up as ‘good contract champions’. This has now risen to 190 and is still open for people interested in making a difference to government contracts. Of the 165 people, 53 were available to speak to us within our timeframe.
The people that we spoke to included buyers, suppliers and legal experts from central and local government, large and small companies, and individual digital specialists from all over the UK.
We wanted to find out what users need from a government contract and the challenges of the current contracting process. We wanted to investigate how model contracts could help make working with government more accessible for a broader range of suppliers. It was also important to determine how model contracts could help buyers get the best value as well as how they could support innovation.
All user research participants wanted contracts to be fit for purpose, legally enforceable, clear and precise. In terms of their specific content needs, there was agreement that the following should be included:
When it came to looking at the current contract process, a number of areas for consideration emerged.
It was important to buyers and suppliers that there’s consistent ownership throughout the contract lifecycle. This ensures that knowledge is retained over time, that the contract maintains the same protections, and that maximum value is realised. With so many different government frameworks, consistency is vital to both buyers and suppliers.
A single document often has to serve multiple audiences, so the content has to be clear and as relevant as it can be to each reader. The structure and design of the contract have to be of a high standard and the language needs to be simple.
When contracts are being agreed, documents are often emailed back and forth. There is a risk of poor version control and peer review issues. Buyers and suppliers need to be able to find the latest version of a contract. Suppliers want to have complete confidence that government knows what it has bought and can understand and manage its contracts.
This is time consuming for buyers and suppliers. For buyers, preparing their requirements, answering suppliers’ questions, evaluating suppliers’ bids and preparing the contract for award all take time. For suppliers, preparing their bid and asking clarification questions takes time.
Suppliers said that they want to feel that buyers fully understand their bid.
Suppliers wanted to be confident in the knowledge of the buyer when it came to the procurement process. Time was being spent supporting the buyer through procurement.
When it came to value for money, thinking fell into 2 categories: procurement and contract value for money. In terms of procurement, buyers wanted to feel as though the amount of time spent completing contracts was proportional to the size of the procurement.
While there are merits to the breadth of frameworks, it is important that detail is thoughtfully included and the content is relevant so there are no unnecessary barriers to use.
In addition, it should be a straightforward task to find and use the right route to market.
It is important that we are aware that using the same terms and conditions across a broad category could prove to be problematic. It could lead to high-level, generic definitions: for example, ‘cloud’.
A uniform approach to risk and service level agreements might not be appropriate for the size or nature of the procurement, and may not match the supply market.
Some suppliers also felt that the way certain clauses are written makes it more difficult to comply with them.
As a first step, we want to test some new ideas for ‘model contracts’ for the most commonly used goods and services, reimagined for the 21st century.
Some areas were commonly highlighted in the issues that came up through the contract lifecycle. The alpha scope will define which, if any, are taken forward. We will be blogging about this shortly.
Follow Warren on Twitter.
In this guest post from MOJ Digital and Technology, Kylie Havelock talks about how the Ministry of Justice recently integrated the 'visit someone in prison' service with their back office systems.
Creating a joined-up justice system is the most important long-term strategic goal for MOJ Digital and Technology and is closely linked to themes within the Government Transformation Strategy.
Although achieving this task is a long-term project, our work is starting to pay off as we’ve recently launched a number of Application Programming Interfaces (APIs) to integrate back office systems with one of our digital services: visit someone in prison.
We built 'visit someone in prison' as part of the GDS transformation programme in 2014 and the service now handles around 1.3 million visits a year. It is the biggest transactional service in the Ministry of Justice, allowing family and friends of offenders to maintain contact with their loved ones.
While the service has helped meet important needs for prisoners and their families, prison staff still needed to do jobs like re-entering prisoner data in different systems.
By integrating the service with a large legacy system we could reduce these time-consuming tasks. We knew that doing something this ambitious wasn’t going to be easy and our eventual success depended on 3 important factors.
When the contract for the back-end prison system came up for renewal, we imposed new conditions.
We collaborated with the Standards Assurance team in the Cabinet Office to approve the contract extension, which helped us work up a set of conditions that were fundamental in shaping the new service:
Transferring the risk onto the business and insisting on agile working enabled us to remove the usual waterfall-style request for service, inevitable hefty quote, big design up front, and so on.
With the contract in place, we brought together a multi-disciplinary, multi-supplier team working in the same location. Our Technical Architects worked collaboratively with the supplier and became increasingly hands-on in delivering the networking and infrastructure required.
Even with the new arrangements, achieving our single largest goal of connecting to the database meant we had to solve 2 major challenges:
We initially had to go through multiple approval gates, where people outside the team scrutinised delivery outputs.
To overcome these non-agile approaches, we’re now working with our suppliers to streamline the deployment process and decrease the time to release.
The legacy system we were integrating with used traditional security risk management tools and annual IT health checks processes.
We overcame this by creating a blended approach. The business maintains its familiar process for the legacy system, while we manage risk for the new APIs through continuous security risk management, with regular penetration testing by our ethical hackers.
We’ve adopted a thoroughly pragmatic approach throughout. Rather than delaying delivery by starting a lengthy discovery phase, we aimed to learn ‘just enough’ about the legacy system to propose an achievable solution.
Recognising that the risk of failure is higher when working with complex architecture, we started with a low-risk product that would still function adequately without the API.
When we found there was no inbound internet connection to the data centre, we established an outbound connection using a websocket product and then upgraded it to bi-directional communication. This wasn’t just cost-effective; it also saved the team the engineering effort of developing its own solution.
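As a rough illustration of the idea, here is a minimal sketch (plain TCP sockets standing in for the websocket product the post mentions, with hypothetical message names): once the data-centre side dials out, that same connection carries traffic in both directions, so the cloud side can push requests back down it without any inbound firewall rule.

```python
import socket
import threading

def cloud_relay(server_sock, results):
    """Cloud side: accept the outbound dial-in, then push a request down it."""
    conn, _ = server_sock.accept()
    conn.sendall(b"GET prisoner/1234\n")   # request travels *down* the link
    results.append(conn.recv(1024))        # reply comes back up the same link
    conn.close()

# Cloud side listens; in reality this is a relay endpoint on the internet.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

results = []
t = threading.Thread(target=cloud_relay, args=(server, results))
t.start()

# Data-centre side: a single *outbound* connection, then two-way traffic.
dc = socket.create_connection(("127.0.0.1", port))
request = dc.recv(1024)                    # request arrives over our own dial-out
if request.startswith(b"GET prisoner/"):
    dc.sendall(b'{"id": "1234", "name": "example"}')
dc.close()
t.join()
```

The same pattern is what upgrading a websocket to bi-directional communication buys you: the firewall only ever sees an outbound connection, but either end can initiate a message once it is open.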
To deliver value more quickly, we recommissioned some end-of-life servers to provide a virtualisation layer we could deploy to. The API has gone live on this hardware while new servers are being commissioned.
Finally, we built the APIs required for our digital service ‘just in time’. Building a general purpose API, rather than one that is specific to our product needs, has meant that other services can make use of the same API calls.
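To sketch the difference a general purpose API makes (all class, field and service names here are illustrative, not the real MOJ API): a resource-oriented call written once can be reused by several products, whereas a product-shaped endpoint serves only the screen it was built for.

```python
class PrisonApi:
    """Hypothetical general purpose, resource-oriented API over a prison database."""

    def __init__(self, db):
        self._db = db

    def get_prisoner(self, prisoner_id):
        # Conceptually GET /prisoners/{id}: one generic call, many consumers.
        return self._db[prisoner_id]

# Two different digital services reuse the same API call:
def visit_booking_summary(api, prisoner_id):
    p = api.get_prisoner(prisoner_id)
    return f"Visit booking for {p['name']} at {p['prison']}"

def location_check(api, prisoner_id):
    # A second, unrelated product built later, with no new API work needed.
    return api.get_prisoner(prisoner_id)["prison"]

db = {"A1234BC": {"name": "J. Example", "prison": "HMP Example"}}
api = PrisonApi(db)
summary = visit_booking_summary(api, "A1234BC")
```

A product-specific endpoint would have baked the visit-booking screen's needs into the API; the generic resource keeps the door open for the other services mentioned below.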
One of our main aims is to break down large, siloed systems and enable a microservices future.
As a result, we simplified a complex asynchronous approach, and built an API that allows reading and writing data in real time. Here’s what our solution is based on:
Defining a support model for an evolving, temporary architecture is tricky. The team was able to support the integration with the Prison Visit Booking service as staff use it during similar working hours to us.
However, as we expand the use of the API, we need to define an appropriate support model. We’ll be looking to the platforms and components guidance to ensure the technology we build can be properly and cost-effectively supported.
So far we have one product using the API, and 3 more digital products that will soon be integrated.
However, we’re planning to expand the number of APIs that will give more services access to the valuable data in currently locked legacy systems. This will help us deliver better digital services.
There’s a long way to go, but this work has been an important step towards our goal of a joined-up justice system that’s faster, simpler and more effective for users.
Kylie is a Product Manager who leads end-to-end service transformation at the Ministry of Justice.
Follow Kylie on Twitter.
Following on from the success of our sessions with Mastodon C and Facebook last year, we kicked off 2017 with the third in our series of Tech, Digital and Data leaders Seminars. This time we heard from Aditya Agarwal the CTO at Dropbox, a file storage and sharing service, who took part in a “fireside chat” with James Stewart.
As discussed in our recent blog post on Software as a Service (SaaS) we’re keen for government to use more tools that are already available to improve the way we work. This session gave us a greater insight into one of those tools and the current thinking of one provider. It was also a good chance for us to share back some areas where we’d like to see more development.
Interoperability is a term that has a lot of different meanings and we spent some time exploring that. For government, interoperability is about removing barriers to entry and reducing lock-in, making sure we’re able to easily use a range of suitable tools. For Dropbox it’s about supporting a wide range of user devices and operating systems, and supporting organic adoption. Both angles are often dismissed as idealistic: something to aim for, but in practice a primary focus for very few companies.
Users often work across a wide variety of platforms so having a tool that transitions seamlessly is ideal. Interoperability also benefits Dropbox by allowing easy, organic growth and allows them to work flexibly across all big tech systems.
MacOS and Windows are the two biggest platforms. Dropbox tests usability of their products across many versions of those two platforms, making sure that both sets of users can understand and easily use the tool. Interestingly, Dropbox found that Mac and Windows users seem to use their devices in very different ways. For example, Dropbox users can simply click and drag a file into their Dropbox to automatically upload it. Mac users tend to be more adept at using that feature than Windows users.
So how do concerns around interoperability affect designing new features? Aditya explained that 60-70% of the time features do work well across multiple platforms. Other times a few minor alterations can rectify any problems.
However there have been some rare occasions where a feature works amazingly well on one platform but just can’t translate across. In those cases the new features have been abandoned.
The group questioned how simple it is to turn off a feature if one user group isn’t using it. It can be painful to turn off a feature on just one version. Aditya pointed out that Dropbox are keen to provide a consistent service to their customers. A Mac user today may be a Windows user next year, or use different platforms at home than in the office. This means it’s very rare for features to differ.
Several years ago internet use started to move towards the use of tablets and smartphones rather than computers. Some companies predicted we’d eventually become ‘mobile only’ and made the decision to focus purely on that market. After a while tech firms realised that wasn’t universally the case.
People are using different devices for different types of activities and for Dropbox mobile use was just a subset of what people do with their primary device. There is still a big market for computers, particularly for work - sending a couple of emails from a mobile is fine, but Dropbox’s observation is that few people would choose to carry out detailed content work from a mobile device.
There needs to be a balance between the two to ensure that functionality isn’t lost when switching between mobile and desktop.
For government, security is always at the forefront of our thinking and concerns about controlling access to sensitive documents can make government departments hesitant to use cloud services.
There’s a cost to that and being overly cautious can prevent productivity.
For both government and organisations like Dropbox an important area of focus will be providing the right interfaces to help users understand the implications of how they use tools (eg. who has access to what files) and providing complete visibility and accountability to IT administrators in ways that help improve security without focussing on prevention.
This is an area where there’s a real need for interoperability - as organisations employ an ever wider range of SaaS tools it’s important that it’s easy for organisations to understand usage without having to build special monitoring for every individual tool.
The session also gave Dropbox the opportunity to learn more about how things work in government.
James Stewart talked about the excitement around the move towards SaaS tools, and some of the particular considerations around using them in the public sector (and often other large organisations). For example, as well as some of the interoperability and security considerations there are also special considerations around how information is archived.
We also talked about some of our common challenges around when to support old devices and operating systems. When it comes to public services, terminating support for out of date technology is not a simple decision.
The least advantaged people in the UK often only access government online services by using computers in Job Centres and libraries, which can be slow to upgrade. Aditya questioned how we make those decisions and we talked about how mechanisms like spend controls and service assessments give us useful insight, as well as how significant security issues like the POODLE attack on SSLv3 can accelerate change.
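The POODLE response is a concrete example of accelerating change through configuration: services simply stopped negotiating SSLv3. As a small, hedged sketch of the modern equivalent (Python's standard `ssl` module; the exact TLS floor a service pins is a policy choice, not something from the post):

```python
import ssl

# A modern default context already refuses SSLv3 (the protocol POODLE broke).
# Pinning an explicit TLS 1.2 floor makes the post-POODLE intent visible in code
# rather than relying on library defaults.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuses SSLv3, TLS 1.0 and 1.1

modern_only = ctx.minimum_version >= ssl.TLSVersion.TLSv1_2
```

Because the fix is a one-line configuration change rather than a code rewrite, a serious vulnerability like POODLE can drive upgrades far faster than routine refresh cycles.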
For both organisations the increasing use of software that automatically upgrades itself is a very positive move that we hope will make things simpler in future.
Follow Emma on Twitter.
In January this year, James Stewart explained how the Technology Leaders Network (TLN) had decided that government was now on a journey away from the Public Services Network (PSN).
For the people and organisations that we talk to across the PSN community, the news didn’t come as much of a surprise: there’s been a world of change in IT trends since the PSN was originally set up more than 10 years ago.
However, the same people told us that they were worried that the PSN compliance process was also set to disappear and, if it did, then that would raise big questions:
So, what’s happening with PSN assurance?
Firstly, the good news is that PSN compliance isn’t going anywhere, certainly for a while yet. The TLN agrees that - as one of the only recognised, externally accredited, cross-government common assurance standards - it needs to live on far beyond the end of the physical network.
As James mentioned in his 'the internet is ok' blog post, government’s move away from the PSN will take some time. There’s currently no timeline as there’s quite a bit of work to do across the public sector to prepare for these changes.
However, as he emphasised in the post, if you’re going to update or change services in the near future, then you should take the opportunity to move them to the internet and secure them appropriately using the best available standards-based approaches.
In the meantime, PSN-connected organisations will still need to continue to meet their assurance requirements if they want to reach the important government and law enforcement services they need.
So, if you’re a PSN-connected organisation or provide a service over the PSN, you’ll need to ensure you continue to demonstrate to us that your organisation’s security arrangements, policies and controls are sufficiently rigorous for us to allow you to interact with the PSN and those connected to it.
That means you’ll need a valid PSN compliance certificate - and do everything you’ve been doing to get one and maintain it - for the foreseeable future.
Well, yes it is for the vast majority of the work that the public sector does but, it’s important to understand that organisations should implement the same security activities whether they are connected to the PSN or any other network, including the internet.
However, as James says in his blog post, it’s going to take some time for the public sector to get to a point when the services it needs to use and the information it needs to access each day can be done over the internet.
We’re working with organisations across government and the public sector and the PSN community, as well as suppliers and service providers, to ensure issues are identified and we’ll work together to provide common solutions. And we’ll be telling you lots more about this as the work progresses.
The PSN compliance process has its roots firmly planted in the need for government departments to understand who is using their data, and to make sure they’re using it properly and looking after it.
That scope is much wider today, as it includes an organisation’s security arrangements, policies and controls. However, more importantly, it symbolises a level of trust that’s recognised by everyone across government.
It’s a badge that tells government that an organisation is “doing the right thing” when it comes to their security and that means they can be trusted by everyone: whether they’re a government department, agency, public body or corporation, devolved administration, local authority, police or criminal justice service, a whole host of service providers or any other organisation across the wider public sector.
Of course, this need for trust won’t go away when government moves away from the PSN network.
It will remain vitally important that organisations across the public sector continue to demonstrate they’re doing the right thing if they’re going to carry on using government services and data.
We’ll need to continue to establish, administer and maintain a level of common trust that will ensure interoperability and interaction with government is preserved.
Today, the National Cyber Security Centre (NCSC) and the Cyber and Government Security Directorate are leading on reducing the risk to the UK by improving its cyber security and cyber resilience. They’ve been impressed with the work that’s taken place around PSN compliance, and are keen that the commitment to “doing security stuff” and maintaining trust across the public sector continues with the same ambition.
So we’ve been looking at ways to expand and reframe PSN compliance in a new context that, while retaining the assurance principles that are the basis of the existing process, will significantly improve the process.
A new context that can tap into the methodology we’ve built for collecting security data, make use of the historical data we hold, build on the co-operative relationships we’ve nurtured across the public sector and, most importantly, make compliance simpler, quicker and more valued by those who achieve it.
We owe it to ourselves and the PSN community as a whole – who have worked hard to get where we are today – to make it better, and we'll be keeping the PSN community up to date as we go.
Late last year we held a Technology Leaders Network meeting to look at the government’s use of Software as a Service (SaaS). Our session focussed on busting the common SaaS myths we hear in order to increase adoption and maintain the effectiveness of on-demand software. These are the 4 main myths, in no particular order.
Perhaps the biggest myth about SaaS is that it lacks security. There seems to be a common misconception across government that SaaS isn’t as secure as on-premise software. Of course there are some weaknesses but that is true of any technology.
A competent SaaS provider has a large budget for security and can invest heavily in mitigating all common risks. If you have a security breach, a provider has a team available 24 hours a day, 7 days a week to tackle any issues. It often wouldn’t be cost-effective for government departments to match such a comprehensive level of service, which is why we’ve recently published some guidance on public sector use of the public cloud.
This myth relates to the idea that SaaS isn’t suitable for big business or government departments. Sometimes cloud services aren’t developed enough for use in government. But there are many areas where SaaS solutions are ideal, such as project management and planning, email and HR.
There are many advantages to using SaaS for major systems, including:
In terms of disadvantages, the government has lots of capital expenditure and this can make it difficult to get your use of SaaS agreed.
If staff aren’t given adequate tools to deliver in their job, then you run the risk of them going out and finding their own alternatives. If this happens then employers have no visibility or control over what is being used. Alternatively, if staff are given access to options that are approved, you can integrate those tools and ensure they are being used securely.
The Common Technology Services (CTS) team recently conducted a survey looking at the most common technology issues encountered by staff across the Civil Service. One of the key issues that came up was being able to find people working on a particular project without knowing their name or job title. Many noted that they had resorted to using websites like LinkedIn to track down colleagues, but not all departments allow access to external sites. At GDS it’s not uncommon to see people reach out via Slack, asking if anyone can point them in the right direction or make an introduction.
Some departments have stated that they haven’t allowed access to tools like Slack because they don’t have the capacity to research them. The Technology Leaders Network agreed to put together guidance on tools they use, outlining their pros, cons and security considerations. Having one centralised piece of guidance that is always being updated will remove the risk for departments using these tools and help allow wider access for civil servants.
To demonstrate this point, we had a live demonstration of a virtual desktop which allows users access to their documents from any supported device. The demo showed that the service is not only cost effective but also user friendly and flexible.
The Technology Leaders Network plans to publish more guidance on SaaS products. Giving departments confidence to use SaaS services should help ensure civil servants have the tools they need to do their jobs efficiently.
We would be interested in hearing what SaaS products are helping your organisation. Leave a comment or email us to let us know.
You can also read a summary of the meeting from Jon Lawrence, Technical Director for Assurance at the National Cyber Security Centre.
There’s been a lot happening since my blogpost last September about the GSi Convergence Framework (GCF) contract. If you’re one of the nearly 600 public sector organisations that currently use this contract for important IT services, this update has important information for you and your organisation.
We’ll be writing to all GCF customers about what’s happening, but I wanted to make sure that we keep all customers updated on the plans as contracts are set to come to an end very soon.
Firstly, from April 2017, Vodafone will only sell its core services as a bundled set. The core services are too interdependent for unbundling to make financial sense for Vodafone or government, particularly following government’s renewed commitment to Cloud First. As a GCF customer, you will need to continue to buy all the core services as a bundle: Vodafone DNS, Mail Relay and Peer-to-Peer, until you stop using all these services.
CCS and GDS are working closely with Vodafone, applying current government network policies and guidance, so that we can reduce and eventually remove the need for these legacy core services. We’ll be working with principal government departments to do this, but we’ll need your help and we’ll contact you soon about this. Vodafone will support these legacy services until March 2019.
From April 2017, the following services will be available via the following commercial routes:
Although Vodafone is still finalising the details of these offers, we do know they will be available as direct award for expediency. We are also working with Vodafone to improve the existing commercial terms. We understand that this information is important for your organisation and we will release further details as soon as they are available to us.
The route away from the legacy core services will be different for each organisation, as it will involve different schedules within the overall time frame of completion by March 2019.
However, as a first-step, it is important that you pre-register to use the National Cyber Security Centre's new UK public sector DNS services, so that you're ready to use them when they go live.
Registration for the UK public sector DNS services involves 4 steps:
The UK public sector DNS services consist of 2 elements: a PSN DNS and an internet DNS. If your organisation has direct connections with the internet as well as connections with the PSN, you can register for both at the same time.
We won’t make the PSN DNS live until we've completed a smooth migration of data from the legacy DNS, which will be later than we originally planned. We’re working with Vodafone to make sure we have all the DNS data we need and that the data is simplified to give consistent behaviour for all PSN customers, whichever part of PSN they’re on.
We’ll continue to work with Vodafone and GDS to make progress with the details that still need to be finalised. We’ll also keep you up to date on the go-live dates for each of these services as soon as we can. If you have any questions about any of this, contact CCS, your usual Vodafone point-of-contact or the GDS team.
Tony Brown is the Category Lead at Crown Commercial Service (CCS)
Unfortunately, the main barrier that prevents departments from investing in these solutions is the contract landscape. Many still have large, legacy contracts using system integrators which affect their ability to change their technology estate. They’re faced with costly change control requests and complicated workarounds to link up cloud-based commodity solutions with their existing technology.
The latest figures from the Complex Transaction Team (CTT) show that there are IT contracts with a value of £3.8bn ending over the course of this Parliament. £1.5bn of these expire over the next two years. As these contracts come to an end, and the associated exit programmes start, departments have the opportunity to adopt common technology solutions to accelerate their business transformation.
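To put those figures in proportion, a quick tally works out how much of the expiring value falls in the near term. The contract names and individual values below are made up for illustration; only the headline totals match the figures quoted above.

```python
# Illustrative only: individual contracts and values are hypothetical,
# chosen so the totals match the CTT headline figures (£3.8bn / £1.5bn).
contracts = [
    {"name": "Contract A", "value_bn": 0.9, "expires_within_2_years": True},
    {"name": "Contract B", "value_bn": 0.6, "expires_within_2_years": True},
    {"name": "Contract C", "value_bn": 1.4, "expires_within_2_years": False},
    {"name": "Contract D", "value_bn": 0.9, "expires_within_2_years": False},
]

total = sum(c["value_bn"] for c in contracts)
near_term = sum(c["value_bn"] for c in contracts
                if c["expires_within_2_years"])
share = near_term / total

print(f"£{total:.1f}bn ending this Parliament; "
      f"£{near_term:.1f}bn ({share:.0%}) within two years")
```

On these figures, roughly two fifths of the expiring contract value comes up for exit within two years, which is why the near-term exit pipeline matters so much.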
This is where we can help. CTS has been leading an approach that draws on expertise from across the Cabinet Office, including the Infrastructure and Projects Authority and the CTT.
By talking to departments with one voice we’re well placed to be able to:
We don’t just provide support for major exit programmes. We’re building up a knowledge base of documents and tools which will be useful to programmes of any size. With this constantly evolving knowledge base we can provide smaller programmes with the advice and capability they need to help themselves. It will continue to grow as we work with more exit programmes, learn lessons and identify the best practice that is already happening out there.
In June last year, we began work to assist the HS2 Programme. James Findlay and his team have already been doing a great job migrating to the cloud, but we were confident we could help them achieve even better value and bring their IT estate back ‘in-house’. This started with a series of collaborative workshops, in which we identified a number of areas where CTS could provide support to accelerate programme delivery.
We’ve agreed with HS2 to provide support and help accelerate work in some of their key workstreams. The main areas are:
We were also asked to support their procurement by supplementing the HS2 team with capability from the Cabinet Office. This has proved more cost-effective for a smaller programme and gives them access to resources they wouldn’t normally have.
By engaging closely with the Cabinet Office Spend Control team at various stages of the programme, the team has helped make sure that conditional approvals were achievable and in line with the Technology Code of Practice, which is central to the Department for Transport and HS2 strategies.
This is one of the team’s first engagements with a department and it’s been a steep learning curve. We’ve learnt a lot about the best way to approach and support programmes of this scale and nature. HS2 are clearly taking the right approach to their contract exit, and we’ve been able to feed their good practices into our exit framework for use by other government departments and agencies.
We’ll use these lessons to further develop the support programme for large expiring contracts and look to support more departments in exiting their contracts in the future.
On 1 December 2016 we launched the ‘Business Change through Common Technology’ community of interest and we’d like you to join us.
Our aim is to improve how business change is managed across government. This community is for anyone, with a special focus on the adoption of, and change associated with, the common technologies we are encouraging government to use.
The technologies we focus on include common desktop, common office productivity tools, common HR and finance systems, common identity and access management and common service management tools - all delivered as services from the cloud. These cloud-based services have their own unique barriers to change and will require specific business change focus if they are to be adopted successfully.
So, along with the usual purpose of being a network for learning and sharing good practices, lessons learnt, case studies, tools, techniques, tips and tricks, stories and experience, we’ve decided that the community will:
The community also wants to be active and not just share ideas and documents. We thought a good place to start is to have a common understanding of business change. Business change is more than just training, but what is it really?
The candidate topics are:
A shared definition of business change may well include all of the above, and more.
So if these topics interest you and you’d like to explore the profession of business change, please join our community.
The community is open to all and has a virtual home within the Government Project Delivery Network on Knowledge Hub. You can join directly from Knowledge Hub or email us.