About Us & The Work We Undertake
Health & Disability Systems, Social & Emotional Wellbeing of People & Community
Employment & Workplace, Skills Development, Early Childhood Services, Education
Aged Care, Seniors Living, Retirement Living, Active Ageing, Ageing in Place
Urbis has the unique ability to bring together social researchers, social planners and economists to provide holistic solutions for our clients. We conduct research and provide strategic advice to guide decision making and support the implementation and evaluation of policies, programs and projects.
Our consultants have worked with government, private sector and NGO clients in a variety of areas including health, disability, community safety, education, environment, defence, employment, housing, justice, infrastructure and transport.
We develop positive working relationships with clients and work in partnership with them to understand their unique needs and help them respond to pressures and ‘wicked problems’.
Alison Wallace National Director – Public Policy 02 8233 9914 awallace@urbis.com.au
Susan Rudland Director – Social Planning 02 8233 9903 srudland@urbis.com.au
Dr. Linda Kurti Director – Public Policy 02 8233 9947 lkurti@urbis.com.au
Claire Grealy Director – Public Policy 03 8663 4858 cgrealy@urbis.com.au
Nicki Hutley Director – Economic Advisory 02 8233 9910 nhutley@urbis.com.au
Jane Homewood Director – Social Planning 03 8663 4936 jhomewood@urbis.com.au
Stephanie Wyeth Director – Social Planning 07 3007 3826 swyeth@urbis.com.au
Urbis provides robust and independent economic advice that is used by governments to optimise policy outcomes, and by organisations to enhance business and investment strategies.
Organisations are challenged daily, across the domestic and global economic landscape, by changes in the policy environment in which they operate. Urbis has extensive experience in assisting peak bodies, not-for-profits and businesses to better understand and negotiate the economic, financial and policy terrain, and to deliver outstanding results to stakeholders.
For more information contact
LEAD DIRECTOR – ECONOMIC ADVISORY PORTFOLIO
Nicki Hutley Director – Economic Advisory 02 8233 9910 nhutley@urbis.com.au
Urbis specialises in research and advice to clients within the health sector, including federal government, state government and non-government organisations. We provide our clients with an understanding of the performance of health systems and the impacts on the social and emotional wellbeing of people and the community.
Mental health and wellbeing
Health
For more information contact
LEAD DIRECTOR – HEALTH PORTFOLIO
Dr. Linda Kurti Director – Public Policy 02 8233 9947 lkurti@urbis.com.au
Great cities require strategic thinking and robust analysis to shape the places, spaces and communities of the future. Our multidisciplinary team of social planners and engagement experts develop smart, sophisticated and pragmatic solutions to address key social, economic, urban sustainability and development challenges.
We are specialists in strategic planning and master planning for cities, communities and activity centres. We have delivered over 200 engagement and place-making processes for the private sector and all levels of government. Our team brings expertise in sustainability and place-based planning, which requires integrated and multidisciplinary participation to create great places and spaces that are vital, engaging and prosperous.
We tailor and adapt our approach depending on project requirements, stakeholder interests, community values and project risks. We work across private and public sectors to provide evidence-based social planning analysis and research that offers certainty, clarity, commercial and community benefit.
For more information contact
LEAD DIRECTOR – URBAN POLICY AND INFRASTRUCTURE PORTFOLIO
Susan Rudland Director – Social Planning 02 8233 9903 srudland@urbis.com.au
Australia is undergoing an unprecedented era of reform in disability, and Urbis is working with a range of government and non-government agencies to support and evaluate new approaches to disability access and inclusion.
For more information contact
LEAD DIRECTOR – DISABILITY PORTFOLIO
Alison Wallace National Director – Public Policy 02 8233 9914 awallace@urbis.com.au
We are experts in research to understand and strengthen organisational, sectoral and national productivity. Our team works across the employment, skills development, early childhood and education sectors, assisting federal and state clients to understand community, business and stakeholder perspectives and to measure outcomes and impacts.
Employment and workforce
Early childhood
School education and infrastructure
For more information contact
LEAD DIRECTOR – EDUCATION AND WORKFORCE PORTFOLIO
Urbis provides a range of expert research and advisory services aimed at improving community safety in both the public and private domains.
Our team includes social planners, policy analysts and program evaluators who bring a range of technical expertise and experience to the task.
For more information contact
LEAD DIRECTOR – COMMUNITY SAFETY AND SECURITY PORTFOLIO
Jane Homewood Director – Social Planning 03 8663 4936 jhomewood@urbis.com.au
Cutting across social planning, urban planning and design, real estate and social infrastructure advisory, Urbis has a long history of conducting research and developing market and consumer responsive solutions for the ageing and retirement sector.
We work with clients to understand emerging trends, commercial drivers, and locality-specific community needs and demands, to ensure that policy, practice and planning are implemented from an evidence-based perspective.
For more information contact
LEAD DIRECTOR – HEALTHY AGEING AND SENIORS LIVING PORTFOLIO
Stephanie Wyeth Director – Social Planning 07 3007 3825 swyeth@urbis.com.au
Lucy Band has joined the Sydney team as a Senior Consultant. Lucy has just returned from the UK, where she worked in corporate communications and engagement at the City of London. Prior to that, Lucy was Communications Manager at the Green Building Council of Australia, working across multiple channels including digital, social media, print and face-to-face engagement. Lucy has also worked in stakeholder engagement at Parsons Brinckerhoff on major infrastructure projects, including water, energy and transport infrastructure.
Guillermo Umana has also joined the Sydney team as a Graduate Consultant. Guillermo previously worked for Urbis as a GIS analyst. He has a Bachelor of Planning and is currently completing a Master of Laws (planning and environmental law) at Macquarie University, and wishes to build on his interest in social planning, social impact assessment, open space and facilities assessment, and sustainable development. We look forward to welcoming Guillermo back to Urbis in a new role.
We are also thrilled to welcome Elaine Henderson into our Melbourne team as a Team Administrator. Over the last several years Elaine has worked for State Trustees, then as a corporate EA for an accounting firm, and most recently with a construction company. Her skills in Excel, her familiarity with a broad range of business systems and her transcription experience are highly valued, and we welcome her to the team.
Urbis has been successful in its application for prequalification under the Government Architect’s Strategy & Design Excellence Prequalification Scheme, qualifying in a number of Strategy Capability areas.
The ESA Sydney team recently completed an important research project for the City of Sydney that sheds light on the health of Sydney’s night-time economy. The report was recently published and covered extensively in the media.
Urbis is partnering with the Aboriginal and Torres Strait Islander Healing Foundation, Aboriginal consultants Karen Milward (Karen Milward Consulting), Graham Gee and Alan Thorpe (Dardi Munwurro), and renowned forensic clinical psychologist Professor James Ogloff, to deliver the 2016-2018 evaluation of the first Aboriginal Social and Emotional Wellbeing Plan for Victorian prisoners.
The Economic and Social Advisory group at Urbis is very much looking forward to working closely with such a terrific project team and with our commissioning partners within the Department of Justice and Regulation – Justice Health and Corrections Victoria.
Urbis’ extensive consultation processes led to the inclusion of 14 recommendations in the final report of the 2015 Review of the Disability Standards for Education 2005.
The Department of Education and Training engaged Urbis to undertake the review of the Standards, which occurs every five years to help clarify the obligations of Australian education and training providers, so that students with disabilities can access and participate in education on the same basis as other students.
Urbis Director Claire Grealy says that the Disability Standards for Education reflect the national priority of access to education for all people.
“Our team is proud to have contributed to the next phase of work in realising this objective,” Ms Grealy said.
The 2015 review recognised that progress has been made since the 2010 review in raising awareness of the Standards with educators through various initiatives and resources.
The Australian Government’s intentions are set out in its initial response to the review.
Urbis’ Economic and Social Advisory team, led by Directors Claire Grealy and Linda Kurti, also included Associate Directors Julian Thomas and Poppy Wise, Senior Consultants Joanna Farmer and Cathy Baldwin, and research assistants Christina Griffiths and Olivia Killeen.
Recent discussions have reflected on the extensive engagement we’ve had as evaluators over the last few years with integration, co-location and collaboration strategies in a range of systems. As a consequence, we’ve looked for ways to monitor these at-times complex beasts, using the presence of success factors as proxies for effective longer-term outcomes beyond the timeframe of the average commissioned evaluation. To monitor, one needs to know what one is looking for.
Helpfully, success factors in joined-up systems are well documented. Research on inter-sectoral collaborations designed to address complex social problems suggests there are four essential prerequisites for effective inter-organisational collaboration.
In a review of co-location and integration investments we conducted in the education context, we observed both successes and challenges for the players in striking the right balance of structure and flexibility; maintaining strong leadership throughout; and securing the right resources and personnel to deliver results. This experience is reflected in the literature, where the success factors fall into the following categories.
Grass roots engagement and planning: development of initiatives from the bottom up, through extensive and ongoing consultation with the community about local needs, requires follow-through to ensure the initiative responds flexibly to those needs. Success is also contingent on investment in partnerships and on engaging the ‘right’ stakeholders – which also means the right number of stakeholders – as determined by the needs of the initiative.
Solid structures to support engagement: success requires a shared vision, commitment, and clearly defined roles and responsibilities agreed between all stakeholders. A strong governance structure involving all key stakeholders – individuals with the skills, knowledge and traits required, and with the authority to make decisions – is also important. Early attention to change management planning, and investing appropriate time in developing strong, shared plans for the project, are also vital, leading to agreed processes for monitoring progress and measuring success.
The right people: successful outcomes require strong and effective leadership. Leaders, managers and staff need the ‘right’ skills and personal traits, including the ability to work collaboratively, to work outside traditional sectoral boundaries, and to work creatively to identify mutually agreeable solutions to complex problems. Support and training for staff are required, recognising that co-location and integration will be a new way of working for many.
Appropriate resourcing: recognition and acceptance that co-location and integration will take time to deliver results, and that investment is required in building community capacity to engage in the learning process, including building relationships and an understanding of the ways community members can contribute. Resources – time, funding and personnel – need to be secured for both the short and the longer term.
Areas of conflict are also documented, as conflict can critically affect the success of a collaborative effort. A review of international experiences from the USA, UK, Canada and Australia found that the following factors presented challenges to integration, and that overcoming them is critical to success:
Inclusiveness: expectations of inclusion in planning, managing and overseeing a collaborative effort make maintaining engagement challenging, and may mean progress is slower than initially anticipated.
Differences in Power: Where projects involve collaboration between large and small organisations, or organisations and individuals, perceptions of power imbalance can cause smaller groups to feel that the effort is being dominated by the larger ones. This can cause smaller organisations to be less engaged with the project.
Differences in Professional Values, Ethics and Priorities: Collaboration between agencies from different sectors can create tensions regarding the cultural values, ethics and priorities placed on outcomes or processes. The success of collaborative initiatives is significantly influenced by the capacity of agencies and individuals to develop mutual respect.
Differences in Agency Commitment: Where involvement in an integration project is driven by policy, regulatory, funding, or other large-scale change, the commitment of agencies — and individuals within those agencies — to the project can differ. This can cause delays in gaining agreement between all parties.
Differences in Agency Priorities and Planning Mechanisms: Organisations used to operating within a defined scope may have difficulty broadening it to include integrated goals and priorities. In particular, working through the funding, regulatory and governance issues related to an expansion of focus can be critical to project success.
Time and Other Resources: Although improved efficiency is often a key goal of integration initiatives, collaboration and joint agreement can take significant time and resources to develop, especially in the early stages. (Valentine et al., 2007).
So we know the dimensions of success, the points of tension and the causes of failure at the individual network or partnership level – how and what do we measure?
Over many years we have developed a range of tools to track the development of networks and assess their breadth and effectiveness, collecting data at several points in time. We then tried to discern a pattern of activity and, from there, the utility of the network or partnership.
Sometimes insights emerged – mostly that the data was unreliable and shed little light on why a network self-reported as effective or otherwise. Often, our approach relied on the same people being involved over a period of time, or on the same level of knowledge being held at the time of our inquiry.
The Partnership Rubric
Fortuitously, my co-researchers Dr Gail Winkworth and Michael White had been applying themselves to this question for some time, and we’d had conversations over the last few years about their work. Underpinning their work are two critical insights: client need should determine the level of collaboration, and there are three ‘must have’ drivers of collaboration – purpose, authority and capability.
Based on these insights, the Partnership Rubric has been developed as a matrix of 18 Enabling Factors, assigned across the three conditions and linked to four Types of Collaboration.
Answering questions about capability, authority and purpose will reveal the level of complexity and therefore effort, resources, leadership, and players needed for the network or partnership to succeed. It will also reveal whether, in fact, you have a network or partnership – this is a pre-condition to the Rubric being useful.
The Partnership Rubric was developed by Gail and Michael over many projects across different organisational contexts, including Income Security, Education, Child Protection, Family Support, Juvenile Justice, Employment, Family Relationships and Family Law. Through it they have been able to track the ‘health’ of a network over time against these key indicators.
How does the Rubric work?
It is both a tool and an action learning cycle, specifically designed to provide networks with a feedback loop on their strengths and weaknesses. The Rubric is administered through a series of surveys, and the results are used to plan the next steps in strengthening the network in alignment with the purpose. The questions are tailored at the commencement of the project to the language and nuances of a particular setting or sector.
As an action learning tool, it provides networks with a step-by-step ‘diagnosis’ of where they sit in relation to all the factors that influence effectiveness. Administered over time, and completed by every network member, it shows progress – or the lack of it – cutting through anecdotal perceptions of performance and revealing the contradictions that so often lie within our self-reported confidence in the effectiveness of interagency efforts.
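To make the mechanics concrete, the sketch below shows one way survey responses against a rubric might be aggregated into a per-factor diagnosis. It is a minimal illustration only: the factor names, the four collaboration levels and the 1–4 rating scale are hypothetical placeholders, not the actual content of Winkworth and White’s Partnership Rubric.

```python
from statistics import mean

# Hypothetical stand-ins only: the real Rubric has 18 Enabling Factors
# grouped under its three conditions (purpose, authority, capability).
FACTORS = ["shared purpose", "authority to act", "know what each other does"]

# Four illustrative levels of collaboration, lowest to highest.
LEVELS = ["networking", "coordinating", "cooperating", "collaborating"]

def diagnose(responses):
    """Average each factor's self-ratings (1-4) across all network
    members, map each average to a level, and flag the weakest factors."""
    scores = {f: mean(r[f] for r in responses) for f in FACTORS}
    diagnosis = {
        f: LEVELS[min(max(round(s), 1), len(LEVELS)) - 1]
        for f, s in scores.items()
    }
    weakest = sorted(scores, key=scores.get)[:3]  # candidates for 'next steps'
    return scores, diagnosis, weakest

# One dict per network member, rating each factor from 1 (low) to 4 (high).
wave_one = [
    {"shared purpose": 4, "authority to act": 3, "know what each other does": 2},
    {"shared purpose": 4, "authority to act": 4, "know what each other does": 1},
]

scores, diagnosis, weakest = diagnose(wave_one)
print(diagnosis)  # averages mapped to a collaboration level per factor
print(weakest)    # 'know what each other does' surfaces first
```

Run against each successive survey wave, the same aggregation shows movement, factor by factor, between waves – the feedback loop described above.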
Rubric data can be presented graphically to assist leaders in communicating with their staff and with the network itself when planning future reforms.
Currently, Urbis’ Economic and Social Advisory practice is using the Rubric in two evaluations, where it is providing insights into the players’ perceptions of their effectiveness as a collaborating group of diverse professionals. For example, a recent baseline survey reflected the common over-reporting of network function, with respondents expressing very high confidence in their effectiveness and suggesting they were operating at very high levels of collaboration.
Yet when the same respondents were asked which three dimensions most needed attention to strengthen their network, the top answer was ‘know what each other does’ – a critical pre-condition to effective collaboration. With feedback on these results, networks can see where they are truly thriving and where some basic building blocks may need revisiting.
Conclusion
The rubric is designed to make sense of the conceptual and professional complexities which accompany the calls for collaboration. It offers both a developmental model of collaboration and a practical tool for individual organisations and networks to analyse their existing collaborative efforts and to plan for future success.
This is an adapted version of a paper presented at the Australasian Evaluation Society (AES) conference, held in Melbourne on 9 September 2015, entitled ‘Describing is good: measuring is better – a new means of measuring the effectiveness of networks and simultaneously strengthening their function’, which Claire co-authored with Dr Gail Winkworth and Michael White.
Evaluation is commonly distinguished from research through its focus on improvement of the subject program, while research seeks to test a theory or hypothesis. Where research seeks to be values-free, evaluators apply evaluative judgement to developing findings and recommendations from the available data.
Making evaluative judgements about program outcomes is known to be an inherently values-laden exercise, yet conscious examination of the values influencing program and evaluation design is not always evident. The values held by those designing programs and commissioning evaluations are often poorly articulated, yet these values influence the definition of program outcomes, how success is conceptualised, and which forms of evidence are given greater credence.
Similarly, evaluation practitioners bring their own values to bear on the development of evaluative questions and methods, the interpretation of data and the generation of findings and recommendations.
The values held by stakeholders in our evaluation work deeply influence their perspectives on what matters and what information is given priority.
Linda and I reflected on a typical scenario for evaluators which illustrates the multiple values frames which are relevant to our practice.
Independent evaluators exercise judgement about programs designed and delivered by providers for beneficiaries, commonly on behalf of funding organisations that commission the evaluation in compliance with accountability requirements established by third-party central agencies.
Consider, for example, a program targeting Aboriginal children, delivered through a public hospital, and evaluated by a commercial evaluator for the Department of Health to guidelines established by Treasury.
We also explored two ‘case studies’ of value frames influencing evaluative practice.
The first was the concept of Public Value developed by Mark Moore.
Public Value is the value created for the public by public institutions, and is analogous to shareholder value in a corporate setting. The idea of public value gives prominence to trust, service and outcomes.
The Public Value paradigm illustrates the difference between business and public service in both the values held (commercial vs social) and the value generated (financial vs socio-political). For example, fairness and procedural justice are generally not commercial values, but they are central to public service – this has implications for the focus of evaluative effort.
Our second case study looked at the Social Return On Investment (SROI) approach to measuring value creation.
Traditional cost-benefit analysis (CBA) implicitly gives more weight to readily measurable economic metrics (avoided future costs, lifetime earnings, carbon emissions and so on). SROI is a form of cost-benefit analysis that expands the scope of CBA to place emphasis on capturing social and beneficiary-defined value.
While both SROI and CBA produce a ratio of value to cost, the different approaches to determining what value is included in the calculations, and how, reflect a difference in the underlying values frame informing the evaluator’s choice of data collection method.
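As a stylised illustration of that difference (every figure below is hypothetical, not drawn from any real program), both calculations produce a benefit-to-cost ratio, but the SROI version admits financial proxies for outcomes the beneficiaries themselves define:

```python
# Hypothetical program: $400,000 invested. All figures are illustrative only.
investment = 400_000

# Benefits a traditional CBA would typically monetise.
cba_benefits = {
    "avoided future service costs": 510_000,
    "increased lifetime earnings": 310_000,
}

# Additional beneficiary-defined outcomes an SROI analysis would monetise
# via financial proxies (e.g. wellbeing valuation or willingness-to-pay).
sroi_extra_benefits = {
    "improved wellbeing (proxy)": 150_000,
    "stronger community connection (proxy)": 90_000,
}

cba_total = sum(cba_benefits.values())
sroi_total = cba_total + sum(sroi_extra_benefits.values())

print(f"CBA ratio:  {cba_total / investment:.2f} : 1")   # 2.05 : 1
print(f"SROI ratio: {sroi_total / investment:.2f} : 1")  # 2.65 : 1
```

The program has not changed between the two calculations; only the values frame determining what counts as value has.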
Ultimately, we argued that as evaluators who apply judgement to evidence and data, we need to explicitly consider whose values are driving overall evaluation focus, questions and design. Similarly, in choosing methods of data collection, it is important to consider what evidence/data is being prioritised and why.
Doing so will strengthen the legitimacy and transparency of evaluation.
This is an adapted version of a paper presented at the Australasian Evaluation Society (AES) conference, entitled ‘Drawing out the values’, co-presented with Urbis Economic and Social Advisory Director Linda Kurti, in Melbourne on 6 September 2015. This article originally appeared on LinkedIn.
I’m relatively new to the field of evaluation – I’ve only been practising for 18 months after an early career foray into health and education research.
I absolutely love working as an evaluator, and have been incredibly lucky to work on some exciting and innovative projects during my time at Urbis.
This piece is partly inspired by some of the frustrations I’ve had adapting from life as a policy researcher with a high degree of freedom to one where I have to fit within the confines of an industry that I think inherently constrains innovation. I offer few answers, but some thoughts on how we can work, as evaluators, commissioners and as an industry to build a courageous climate in evaluation.
Before I begin proper, two caveats. The first is that I do not intend to define innovation. This is for three reasons: I don’t think it’s actually necessary for the logic of my arguments; I don’t think it’s helpful to explicitly define a concept based on the idea of stretching current conceptual understandings; and, finally, I’m genuinely not sure what definition I would arrive at even after contemplating the issue for months.
My second caveat is that I write from my professional perspective as an evaluator – that is, from within an independent, commercial evaluation consultancy working almost exclusively with public sector clients. That said, I think my thoughts are transferable to a diversity of clients and commissioning contexts.
But why care about innovation anyway? It is of value to the evaluation, to the field, and to the evaluator. Innovation can help to deliver new insights on a program or policy. Often innovation emerges as a response to a specific problem in a specific situation, and is consequently of high value. New ideas can lead to new processes that make evaluation approaches more efficient and effective. This benefits the evaluation community by furthering the sector, both in a practical sense and in an academic sense, where knowledge is developed for knowledge’s sake (something often regarded as frivolous, but I think vital to living in a more interesting world).
Finally, I want to work on fun and interesting evaluations, not operate as the evaluation machine. Innovation helps me, us as evaluators, to engage more deeply in the work, and to have a purpose for getting up in the morning to go to work. In turn, higher engagement with the evaluation process by the evaluator leads to better work.
But there are significant structural barriers in the evaluation industry as it currently stands which hinder innovative practice. In the context of evaluation, I think these barriers occur at three key points in the innovation cycle: the generating of ideas, the selecting of ideas, and the implementation of ideas.
The first barrier to innovation is perhaps the most obvious – why make the effort to do something new when the way you are doing things already ‘works’? I think this is a particular concern in the evaluation commissioning world, as independent evaluators tend towards conservatism in order to win jobs. If there is general acceptance of the methods and processes that work, there is no incentive to act as a first mover proposing new ideas when tendering for a job. This is especially the case in a fixed-fee world, where there is little financial incentive to do something different – the large financial pay-offs that exist in some private sectors simply don’t exist here.
When it comes to selecting ideas, even if evaluators take the risk of proposing new concepts, the commissioner might not take the risk of accepting them. At the moral level, this is because public servants act as delegates of the community in providing services and evaluations. In the private sector, innovators negotiate openly with investors to literally buy in to their ideas. The public have not bought in to the idea of public servants gambling with their money on innovations, which inherently carry risks. By and large, public servants respect this by taking a risk-averse approach.
And the risks are significant. In particular, accountability unbalances the risk/reward ratio of innovation. Sadly, we rarely get rewarded when things go right, and this is the case in evaluation. This creates a situation where if we stick to the status quo, it works and nobody comments. We try the new innovation, it works and nobody comments. We try the new innovation, it fails, and a furore is unleashed. Thus the risks are high, but the rewards of innovation are uncertain and not held personally in the way that backing an innovation in the private sector carries potential financial reward. Often the perception of risk is enough to put people off, even if the likelihood of the risk eventuating is minimal.
Effective evaluation addresses program and policy implications in a way that is of benefit, not just to evaluation commissioners but also to all stakeholders in the program. Evaluation findings need to be acceptable, and this often means that the evaluation process has to have legitimacy in their eyes too. Therefore, even in situations where we have courageous clients, it may be the case that they don’t have courageous stakeholders, stymying the innovation process once again.
So how do we overcome these barriers? As evaluators, we have a key role to play in selling our ideas to commissioners of evaluation. From the outset of tendering, we have to act in ways that avoid hubris and reassure clients that our approaches will, on balance, deliver benefits that outweigh innovation risks. Explain why you think the approach will work, and be willing to compromise if necessary to minimise risks while maintaining the benefits of the evaluation. Compromise is important as we learn from clients, who are the subject matter experts in the evaluation. Often they will identify risks to the innovation that we won’t recognise because they operate at a different level of knowledge, and we need to accept that.
Commissioners should have faith that the evaluator is acting in their best interest and, as the master of method, can offer perspectives on how best to minimise risk in evaluation approaches. Increasingly it seems that commissioners do understand the importance of innovation, but are incentivising it in an unhelpful way. For example, many Victorian tenders now require evaluators to complete a separate section outlining their specific innovative contributions. I think this is an unhelpful lens through which to view innovation. Innovation is not always necessary – sometimes we don’t need new, we just need things done well. And sometimes innovation is the path to ensuring that evaluation is done well. It’s important, then, to view innovation as part of the process of evaluation improvement, not an ‘add-on’ to be considered separately.
From a practical perspective, there may be changes in approach needed to accommodate innovation. Risk is a consistent theme in innovation, and risk requires risk management. Innovative approaches often need more project management in order to succeed, and to reassure commissioners and stakeholders throughout the process.
Often this reassurance comes in the form of partnership with others. Many of the things that we as evaluators view as innovations are accepted practice in other disciplines. One of the key strengths of evaluation in my mind is that it draws on theory and practice from a range of other fields to deliver the best answers to evaluation questions. Harness the power of others, and your innovation might not seem so risky after all.
My final thought is that, amongst all this, there is a key role for us as evaluators, and for the AES, as advocates for the potential of evaluation. We need to make innovation the norm in evaluation. We need to support clients to become more courageous.
This is an adapted version of a paper presented at the Australasian Evaluation Society (AES) conference, held in Melbourne on 6 September 2015. This article originally appeared on LinkedIn.