When Your Building Super Is an A.I. Bot

The new maintenance coordinator at an apartment complex in Dallas has been getting kudos from tenants and colleagues for good work and late-night assistance. Previously, the eight people on the property’s staff, managing the buildings’ 814 apartments and town homes, were overworked and putting in more hours than they wanted.

Unlike the human staff, the new team member at the complex, the District at Cypress Waters, is available 24/7 to schedule repair requests and never takes time off.

That’s because the maintenance coordinator is an artificial intelligence bot that the property manager, Jason Busboom, began using last year. The bot, which sends text messages using the name Matt, takes requests and manages appointments.

The team also has Lisa, the leasing bot that answers questions from prospective tenants, and Hunter, the bot that reminds people to pay rent. Mr. Busboom chose the personalities he wanted for each A.I. assistant: Lisa is professional and informative; Matt is friendly and helpful; and Hunter is stern, needing to sound authoritative when reminding tenants to pay rent.

The technology has freed up valuable time for Mr. Busboom’s human staff, he said, and everyone is now much happier in his or her job. Before, “when someone took vacation, it was very stressful,” he added.

Chatbots — as well as other A.I. tools that can track the use of common areas and monitor energy use, aid construction management and perform other tasks — are becoming more commonplace in property management. The money and time saved by the new technologies could generate $110 billion or more in value for the real estate industry, according to a report released in 2023 by McKinsey Global Institute. But A.I.’s advances and its catapult into public consciousness have also stirred up questions about whether tenants should be informed when they’re interacting with an A.I. bot.

Ray Weng, a software programmer, learned he was dealing with A.I. leasing agents while searching for an apartment in New York last year, when agents at two buildings used the same name and gave the same answers to his questions.

“I’d rather deal with a person,” he said. “It’s a big commitment to sign a lease.”

Some of the apartment tours he took were self-guided, Mr. Weng said, “and if it’s all automated, it feels like they don’t care enough to have a real person talk to me.”

EliseAI, a software company based in New York whose virtual assistants are used by owners of nearly 2.5 million apartments across the United States, including some operated by the property management company Greystar, is focused on making its assistants as humanlike as possible, said Minna Song, the chief executive of EliseAI. Aside from being available through chat, text and email, the bots can interact with tenants via voice and can have different accents.

The virtual assistants that help with maintenance requests can ask follow-up questions like verifying which sink needs to be fixed in case a tenant isn’t available when the repair is being done, Ms. Song said, and some are beginning to help renters troubleshoot maintenance issues on their own. Tenants with a leaky toilet, for example, may receive a message with a video showing them where the water shut-off valve is and how to use it while they wait for a plumber.

The technology is so good at carrying on a conversation and asking follow-up questions that tenants often mistake the A.I. assistant for a human. “People come to the leasing office and ask for Elise by name,” Ms. Song said, adding that tenants have texted the chatbot to meet for coffee, told managers that Elise deserved a raise and even dropped off gift cards for the chatbot.

Not telling customers that they’ve been interacting with a bot is risky. Duri Long, an assistant professor of communication studies at Northwestern University, said it could make some people lose trust in the company using the technology.

Alex John London, a professor of ethics and computational technologies at Carnegie Mellon University, said people could view the deception as disrespectful.

“All things considered, it is better to have your bot announce at the beginning that it is a computer assistant,” Dr. London said.

Ms. Song said it was up to each company to monitor evolving legal standards and be thoughtful about what it told consumers. A vast majority of states do not have laws requiring companies to disclose the use of A.I. in communications with people, and the laws that do exist primarily cover influencing voting and sales, so a bot used to schedule maintenance or send rent reminders wouldn't have to be disclosed to customers. (The District at Cypress Waters does not tell tenants and prospective tenants that they're interacting with an A.I. bot.)

Another risk involves the information that the A.I. is generating. Milena Petrova, an associate professor who teaches real estate and corporate finance at Syracuse University, said humans needed to be "involved to be able to critically analyze any results," especially for interactions beyond the simplest and most common ones.

Sandeep Dave, chief digital and technology officer of CBRE, a real estate services firm, said it didn’t help that the A.I. “comes across as very confident, so people will tend to believe it.”

Marshal Davis, who manages real estate and runs a real estate technology consulting company, monitors the A.I. system he created to help his two office workers answer the 30 to 50 calls they receive daily at a 160-apartment complex in Houston. The chatbot is good at answering straightforward questions, like those about rent payment procedures or details about available apartments, Mr. Davis said. But on more complicated issues, the system can "answer how it thinks it should and not necessarily how you want it to," he said.

Mr. Davis records most calls, runs them through another A.I. tool to summarize them and then listens to the ones that seem problematic — like “when the A.I. says, ‘Customer voiced frustration,’” he said — to understand how to improve the system.

Some tenants aren’t completely sold. Jillian Pendergast interacted with bots last year while searching for an apartment in San Diego. “They’re fine for booking appointments,” she said, but dealing with A.I. assistants instead of humans can get frustrating when they start repeating responses.

“I can see the potential, but I feel like they are still in the trial-and-error phase,” Ms. Pendergast said.
